
Module 2: BTEQ

After completing this module, you will be able to:

- Use .EXPORT to SELECT data from the Teradata database to another computer.
- State the purpose of the four types of BTEQ .EXPORT.
- Use .IMPORT to process input from a host-resident data file.
- Use indicator variables to preserve NULLs.
- Describe multiple sessions and how they make parallel access possible.

BTEQ

Batch-mode utility for submitting SQL requests to the Teradata database. Runs on every supported platform, from laptop to mainframe.

Flexible and easy-to-use report writer.


Exports data from the Teradata database to a client system:

- As displayable characters, suitable for reports, or
- In native host format, suitable for other applications.

Reads input data and imports it to the Teradata database as INSERTs, UPDATEs, or DELETEs.

Limited ability to branch forward to a LABEL, based on a return code or an activity count. BTEQ does error reporting, not error capture.

Using BTEQ Conditional Logic


The Bank offers a number of special services to its Million-Dollar customers.
DELETE FROM Million_Dollar_Customer ALL;
.IF ERRORCODE = 0 THEN .GOTO TableOK
CREATE TABLE Million_Dollar_Customer
    (Account_Number         INTEGER
    ,Customer_Last_Name     VARCHAR(20)
    ,Customer_First_Name    VARCHAR(15)
    ,Balance_Current        DECIMAL(9,2));
.LABEL TableOK
INSERT INTO Million_Dollar_Customer
SELECT  A.Account_Number, C.Last_Name, C.First_Name, A.Balance_Current
FROM    Accounts A
        INNER JOIN Account_Customer AC
        INNER JOIN Customer C
        ON C.Customer_Number = AC.Customer_Number
        ON A.Account_Number = AC.Account_Number
WHERE   A.Balance_Current GT 1000000;
.IF ACTIVITYCOUNT > 0 THEN .GOTO Continue
.QUIT
.LABEL Continue

DELETE all rows from the Million_Dollar_Customer table. IF this results in an error (non-zero ERRORCODE), THEN create the table; ELSE attempt to populate it using INSERT/SELECT. IF some rows are inserted (ACTIVITYCOUNT > 0), THEN arrange services; ELSE terminate the job.

BTEQ Error Handling


.SET ERRORLEVEL 2168 SEVERITY 4, (2173, 3342, 5262) SEVERITY 8
.SET ERRORLEVEL UNKNOWN SEVERITY 16
SELECT ............. FROM ............. ;
.IF ERRORLEVEL >= 14 THEN .QUIT 17;

You can assign an error level (SEVERITY) for each error code returned and make decisions based on the level you assign. Capabilities:

- Customize the mapping from error code to ERRORLEVEL.
- Test the resulting ERRORLEVEL with .IF to either continue processing or terminate.
- .SET MAXERROR <integer> defines the termination threshold.
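The mapping logic above can be sketched outside BTEQ as well. The following Python sketch (an illustration only, not BTEQ itself) uses the error codes and severities from the example: specific codes get assigned severities, anything unmapped falls back to the UNKNOWN severity, and the script quits when the level reaches the threshold.

```python
# Hypothetical mapping in the spirit of .SET ERRORLEVEL: specific codes
# get custom severities; any unknown code gets the catch-all level.
ERRORLEVEL = {2168: 4, 2173: 8, 3342: 8, 5262: 8}
UNKNOWN_SEVERITY = 16

def severity(error_code):
    """Return the assigned severity for an error code."""
    return ERRORLEVEL.get(error_code, UNKNOWN_SEVERITY)

def should_quit(error_code, threshold=14):
    """Mirrors .IF ERRORLEVEL >= 14 THEN .QUIT from the example."""
    return severity(error_code) >= threshold
```

With this mapping, code 2173 (severity 8) continues, while an unmapped code (severity 16) terminates the job.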

BTEQ EXPORT
export1.btq

BTEQ Script
Note: For a channel-attached host, the output file is specified as DDNAME instead of FILE.

.LOGON tdp1/user1,passwd1
.EXPORT DATA FILE=/home2/user1/datafile1
SELECT  Account_Number
FROM    Accounts
WHERE   Balance_Current LT 100;
.EXPORT RESET
.QUIT

Default BTEQ output:

bteq < export1.btq

Logon complete
1200 Rows returned
Time was 15.25 seconds

datafile1 (data file of Account Numbers):
12348009
19450824
23498763
23748091
85673542
19530824
92234590
:

BTEQ .EXPORT
Syntax:

.EXPORT  DATA | INDICDATA | REPORT | DIF [DATALABELS]
         FILE = filename | DDNAME = ddname  [, LIMIT = n]  [, OPEN | CLOSE]
         [AXSMOD modname 'init_string']
.EXPORT  RESET

.EXPORT DATA: Sends results to a host file in record mode.
.EXPORT INDICDATA: Sends query results that contain indicator variables to a host file. Allows host programs to deal with NULLs.
.EXPORT REPORT: Sends results to a host file in field mode. The data set contains column headings and formatted data. Data is truncated if a line exceeds 254 characters (REPORT mode).
.EXPORT DIF: Output is converted to Data Interchange Format, used to transport data to various PC programs, such as Lotus 1-2-3.
.EXPORT RESET: Reverses the effect of a previous .EXPORT and closes the output file.
LIMIT n: Sets a limit on the number of rows captured.
OPEN/CLOSE: Output data set or file is either OPENed or CLOSEd during RETRY.
AXSMOD: Access module used to export to tape.

BTEQ .EXPORT Script Example


export2.btq
.LOGON tdp1/user1,passwd1
.EXPORT DATA FILE=/home2/user1/datafile2, LIMIT=100
SELECT  Account_Number
FROM    Accounts
WHERE   Balance_Current < 500;
.EXPORT RESET
.QUIT

To execute this script in a UNIX environment:


$ bteq < export2.btq | tee export2.out

*** Success, Stmt# 1 ActivityCount = 330
*** Query completed. 330 rows found. 1 column returned.
*** Total elapsed time was 1 second.
*** Warning: RetLimit exceeded. Ignoring the rest of the output.

datafile2 contains 100 account numbers. export2.out contains the informational text returned to the console.


$

BTEQ Data Modes


Field mode is set by: .EXPORT REPORT

Column A  Column B  Column C
       1         2         3
       4         5         6
       7         8         9

Transfers data one column at a time, with numeric data converted to character.

Record mode is set by: .EXPORT DATA

f1 f2 f3
f1 f2 f3
f1 f2 f3

Transfers data one row at a time in host format. Nulls are represented as zeros or spaces.

Indicator mode is set by: .EXPORT INDICDATA

Indic. Byte(s)  f1 f2 f3
Indic. Byte(s)  f1 f2 f3
Indic. Byte(s)  f1 f2 f3

Transfers data one row at a time in host format, sending indicator variables that flag NULLs. The data bytes for a null field still contain zeros or spaces; the leading indicator bits identify which fields are null.
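The difference between field mode and record mode can be sketched in Python (an illustration only; the two-space separator and little-endian byte order are assumptions for the example, not BTEQ's exact formats):

```python
import struct

# One row with two INTEGER columns, e.g. two account numbers.
row = (12348009, 19450824)

def field_mode(row):
    # Field (REPORT) mode: numeric data converted to displayable characters.
    return '  '.join(str(v) for v in row)

def record_mode(row):
    # Record (DATA) mode: native host format; here two 4-byte
    # little-endian signed integers, 8 bytes total.
    return struct.pack('<2i', *row)
```

A report-style consumer reads the character form; another application reads the 8-byte binary record directly.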

Indicator Variables
Indicator variables allow utilities to process records that contain NULL indicators.

BTEQ: .EXPORT INDICDATA and .[SET] INDICDATA [ON]
FastLoad, MultiLoad, FastExport, TPump: INDICATORS

INDICATORS causes the leading n bytes of the record to be treated as NULL indicators instead of data.
NULL columns:

Indicator bytes:  00101000 00000000
Fields:           F1 F2 F3 F4 F5 F6 F7 F8 F9 F10 F11 F12

Bits 3 and 5 of the first indicator byte are set: Field 3 is null, Field 5 is null.
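Decoding the indicator bytes can be sketched in a few lines of Python (an illustration of the bit layout above, not part of any Teradata utility):

```python
def null_fields(indicator_bytes, num_fields):
    """Return the 1-based field numbers flagged NULL by the leading
    indicator bytes. The most significant bit of the first byte maps
    to field 1, the next bit to field 2, and so on."""
    nulls = []
    for i in range(num_fields):
        byte = indicator_bytes[i // 8]
        if byte & (0x80 >> (i % 8)):
            nulls.append(i + 1)
    return nulls
```

Applied to the two bytes above (00101000 00000000) for a 12-field record, it reports fields 3 and 5 as null.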

Determining the Logical Record Length with Fixed Length Columns


CREATE TABLE Customer, FALLBACK
    (Customer_Number    INTEGER
    ,Last_Name          CHAR(8)
    ,First_Name         CHAR(8)
    ,Social_Security    INTEGER
    ,Birth_Date         DATE
    ,OD_Limit           DECIMAL(7,2))
UNIQUE PRIMARY INDEX (Customer_Number);

Column lengths (bytes): Customer_Number 4, Last_Name 8, First_Name 8, Social_Security 4, Birth_Date 4, OD_Limit 4. Total: 32.

[Slide shows two sample 32-byte records (Jones and Jack), with byte positions 1-4 Customer #, 5-12 Last Name, 13-20 First Name, 21-24 Social Security, 25-28 Birth Date, 29-32 OD Limit.]

Determining the Logical Record Length with Variable Length Columns


CREATE TABLE Customer, FALLBACK
    (Customer_Number    INTEGER
    ,Last_Name          VARCHAR(8)
    ,First_Name         VARCHAR(8)
    ,Social_Security    INTEGER
    ,Birth_Date         DATE
    ,OD_Limit           DECIMAL(7,2))
UNIQUE PRIMARY INDEX (Customer_Number);

Column lengths (bytes): Customer_Number 4, Last_Name 10, First_Name 10, Social_Security 4, Birth_Date 4, OD_Limit 4. Total: 36.

Redefining Last_Name and First_Name each as VARCHAR(8) can save up to 7 trailing spaces per column, but adds a 2-byte length field to each, so the maximum record length grows from 32 to 36.

[Slide shows the same two sample records; each VARCHAR field is preceded by its 2-byte length.]

Determining the Logical Record Length with .EXPORT INDICDATA


CREATE TABLE Customer, FALLBACK
    (Customer_Number    INTEGER
    ,Last_Name          VARCHAR(8)
    ,First_Name         VARCHAR(8)
    ,Social_Security    INTEGER
    ,Birth_Date         DATE
    ,OD_Limit           DECIMAL(7,2))
UNIQUE PRIMARY INDEX (Customer_Number);

Column lengths (bytes): Customer_Number 4, Last_Name 10, First_Name 10, Social_Security 4, Birth_Date 4, OD_Limit 4.

All 6 columns are nullable, adding 6 bits (rounded up to 1 byte) to each record when using .EXPORT in INDICDATA mode. Therefore the maximum record length equals 36 + 1 = 37.

Assume in this example that Social_Security is NULL: bit 4 of the indicator byte is set, and the 4 data bytes for Social_Security contain zeros.

[Slide shows the same two sample records, each preceded by its indicator byte.]

.IMPORT
(for Channel-Attached Systems)
.IMPORT loads data from the host into the Teradata database, with the input fields defined in a USING clause. INDICDATA preserves NULLs.

Syntax:

.IMPORT  DATA | INDICDATA  { DDNAME | FILE } = ddname  [, SKIP = n]

DATA: Reads a host file in record mode.
INDICDATA: Reads data in host format in record mode, using indicator variables to identify NULLs.
DDNAME: Name of an MVS JCL DD statement or CMS FILEDEF.
FILE: Name of the input data set in all other environments.
SKIP = n: Number of initial records from the data stream to skip before the first row is transmitted.

.IMPORT
(for Network-Attached Systems)
Syntax:

.IMPORT  DATA | INDICDATA | REPORT | VARTEXT ['c']
         { FILE | DDNAME } = filename  [, SKIP = n]
         [AXSMOD modname 'init_string']

DATA: Imports data from the server to Teradata, with the input fields defined in a USING clause.
INDICDATA: Import records contain NULL indicator bits.
REPORT: Imports Teradata report data, expected in BTEQ EXPORT REPORT format.
VARTEXT: Treats the record format as variable-length character fields. The default delimiter is |, or specify a field delimiter within single quotes.
AXSMOD: Access module used to import from tape.
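Parsing one VARTEXT record amounts to splitting on the delimiter, with an empty field between two adjacent delimiters representing a NULL. A minimal Python sketch of that convention (the record content is invented for illustration):

```python
def parse_vartext(record, delimiter='|'):
    """Split one VARTEXT record into variable-length character fields.
    An empty field (two adjacent delimiters) is treated as NULL (None)."""
    return [f if f != '' else None
            for f in record.rstrip('\n').split(delimiter)]
```

For example, the record '1001||Jones|19580824' yields four fields with the second one NULL.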

BTEQ IMPORT
(Data Load from the Host)
.LOGON tdp1/user1,passwd1
.IMPORT DATA DDNAME = datain3
.QUIET ON
.REPEAT *
USING in_CustNo   (INTEGER)
    , in_SocSec   (INTEGER)
    , Filler      (CHAR(30))
    , in_Lname    (CHAR(20))
    , in_Fname    (CHAR(10))
INSERT INTO Customer
     ( Customer_Number
     , Last_Name
     , First_Name
     , Social_Security )
VALUES
     ( :in_CustNo
     , :in_Lname
     , :in_Fname
     , :in_SocSec );
.QUIT

.QUIET ON: Limits output to reporting only errors and request processing statistics.
.REPEAT *: Causes BTEQ to read records until EOF.
USING: Defines the input data from the host.

BTEQ IMPORT
(from a UNIX Environment)
bteq
Enter your BTEQ Command:
.RUN FILE = jobscript.btq

or

bteq < jobscript.btq | tee jobscript.out

.QUIET ON: Limits output to reporting only errors and request processing statistics.
.REPEAT *: Causes BTEQ to read records until EOF.
USING: Defines the input data from the UNIX server.

jobscript.btq
.LOGON tdp1/user1,passwd1
.IMPORT DATA FILE = /home2/user1/datafile4
.QUIET ON
.REPEAT *
USING in_CustNo (INTEGER)
    , in_SocSec (INTEGER)
UPDATE Customer
   SET Social_Security = :in_SocSec
 WHERE Customer_Number = :in_CustNo;
.QUIT

BTEQ IMPORT
(from a PC-Connected LAN)
From Teradata Command Prompt on PC:
c:\> bteq
Teradata BTEQ 08.02.00.00 for WIN32.
Enter your logon or BTEQ command:
.RUN FILE = c:\td_scripts\import1.btq

import1.btq
.LOGON tdp1/user1,passwd1
.IMPORT DATA FILE = E:\datafile5
.QUIET ON
.REPEAT *
USING in_CustNo (INTEGER)
    , in_SocSec (INTEGER)
UPDATE Customer
   SET Social_Security = :in_SocSec
 WHERE Customer_Number = :in_CustNo;
.QUIT

Multiple Sessions
Session: A logical connection between the host and the Teradata database; a workstream composed of a series of requests between the host and the database.

Multiple sessions:

- Allow tasks to be worked on in parallel.
- Require row hash locking for parallel processing: UPI, NUPI, and USI transactions.
- Too few sessions degrade performance; too many will not improve performance.
- Initializing a single session typically takes 1 to 2 seconds.

.SET SESSIONS
.SET SESSIONS 8
.LOGON tdp1/user1,passwd1
.IMPORT DATA DDNAME=datain6
.QUIET ON
.REPEAT *
USING in_CustNo  (INTEGER)
    , in_SocSec  (INTEGER)
    , Filler     (CHAR(30))
    , in_LName   (CHAR(20))
    , in_FName   (CHAR(10))
INSERT INTO Customer
     ( Customer_Number
     , Last_Name
     , First_Name
     , Social_Security )
VALUES
     ( :in_CustNo
     , :in_LName
     , :in_FName
     , :in_SocSec );
.QUIT

Parallel Processing Using Multiple Sessions to Access Individual Rows


Although a single row can reside on only 1 AMP, locating it without index information may require a full table scan, and all AMPs are required to participate.

If the location of the row is known, only the applicable AMP needs to be involved. The other AMPs can work on other tasks, which is where multiple sessions are useful.

[Slide diagrams contrast one transaction occupying all four AMPs/VDisks with several single-AMP transactions spread across them.]

[Slide diagram shows TXN 1 through TXN 12 distributed across four AMPs/VDisks, several per AMP.]

Multiple transactions execute in parallel, provided that:

- Each transaction uses fewer than all AMPs.
- Enough transactions are submitted to keep ALL AMPs busy.
- Each parallel transaction has a unique internal ID.
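The idea of many independent single-AMP requests in flight at once can be sketched with a thread pool (an analogy only; real BTEQ sessions are database connections, and the request function here is invented for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def run_request(txn):
    """Stand-in for one single-row, row-hash-locked request;
    each runs independently of the others."""
    return f"TXN {txn} done"

# Eight workers play the role of .SET SESSIONS 8; twelve small
# transactions are spread across them and execute in parallel.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_request, range(1, 13)))
```

As with sessions, adding workers only helps while each unit of work occupies less than the whole machine.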

When Do Multiple Sessions Make Sense?


Multiple sessions improve performance ONLY for SQL requests that impact fewer than ALL AMPs.
TRANS_HISTORY columns: Trans_Number, Trans_Date, Account_Number (FK, NN, NUPI), Trans_ID, Amount. The slide also marks PK, USI, and NUSI indexes on the table.

Which of the following batch requests would benefit from multiple sessions? For each, determine the transaction type, whether it takes a table or row lock, and whether multiple sessions are useful.

1. INSERT INTO Trans_History VALUES (:T_Nbr, DATE, :Acct_Nbr, :T_ID, :Amt);
2. SELECT * FROM Trans_History WHERE Trans_Number = :Trans_Number;
3. DELETE FROM Trans_History WHERE Trans_Date < DATE - 120;
4. DELETE FROM Trans_History WHERE Account_Number = :Account_Number;

When Do Multiple Sessions Make Sense?


Multiple sessions improve performance ONLY for SQL requests that impact fewer than ALL AMPs.
TRANS_HISTORY columns: Trans_Number, Trans_Date, Account_Number (FK, NN, NUPI), Trans_ID, Amount. The slide also marks PK, USI, and NUSI indexes on the table.

Which of the following batch requests would benefit from multiple sessions?

1. INSERT INTO Trans_History VALUES (:T_Nbr, DATE, :Acct_Nbr, :T_ID, :Amt);
   Trans Type: NUPI    Lock: Row Hash     Multiple Sessions Useful? Yes

2. SELECT * FROM Trans_History WHERE Trans_Number = :Trans_Number;
   Trans Type: NUSI    Lock: Full Table   Multiple Sessions Useful? No

3. DELETE FROM Trans_History WHERE Trans_Date < DATE - 120;
   Trans Type: FTS     Lock: Full Table   Multiple Sessions Useful? No

4. DELETE FROM Trans_History WHERE Account_Number = :Account_Number;
   Trans Type: NUPI    Lock: Row Hash     Multiple Sessions Useful? Yes

Application Utility Checklist


Feature                   BTEQ    FastLoad    FastExport    MultiLoad    TPump

DDL Functions             ALL
DML Functions             ALL
Multiple DML              Yes
Multiple Tables           Yes
Multiple Sessions         Yes
Protocol Used             SQL
Conditional Expressions   Yes
Arithmetic Calculations   Yes
Data Conversion           Yes
Error Files               No
Error Limits              No
User-written Routines     No

Review Questions
Answer True or False.

1. With BTEQ you can import data from the host to Teradata AND export from Teradata to the host.
2. .EXPORT DATA sends results to a host file in field mode.
3. INDICDATA is used to preserve nulls.
4. With BTEQ, you can use conditional logic to bypass statements based on a test of an error code.
5. It is useful to employ multiple sessions when ALL AMPs will be used for the transaction.
6. With .EXPORT, you can have output converted to a format that can be used with PC programs.

Module 2: Review Question Answers


Answer True or False.

1. True. With BTEQ you can import data from the host to Teradata AND export from Teradata to the host.
2. False. .EXPORT DATA sends results to a host file in field mode. (Results are in record mode.)
3. True. INDICDATA is used to preserve nulls.
4. True. With BTEQ, you can use conditional logic to bypass statements based on a test of an error code.
5. False. It is useful to employ multiple sessions when ALL AMPs will be used for the transaction. (It is useful when fewer than all AMPs are used.)
6. True. With .EXPORT, you can have output converted to a format that can be used with PC programs.

Lab Exercises
Lab Exercise 2-1
Purpose
In this lab, you will use BTEQ to perform imports with different numbers of sessions. You will move selected rows from the AU.Accounts table to your personal Accounts table, and from a data file to your table, repeating tasks with different numbers of sessions.

What you need
Populated AU.Accounts table and your empty Accounts table.

Tasks
1. INSERT/SELECT all rows from the populated AU.Accounts table to your own userid.Accounts table. Note the timing and verify that you have the correct number of rows.
   Time: ________  Number of rows: ________
2. Export 1000 rows to a data file (data2_1).
3. Delete all rows from your userid.Accounts table.
4. Import the rows from your data set (data2_1) to your empty userid.Accounts table. Note the time and verify the number of rows.
   Time: ________  Number of rows: ________
5. Delete all the rows from your userid.Accounts table again.

Lab Exercises (cont.)


Lab Exercise 2-1 (cont.)
Tasks
6. Specify 8 sessions and import the rows from your data set to your empty userid.Accounts table. Note the time and verify the number of rows.
   Time: ________  Number of rows: ________
7. Delete all the rows from your userid.Accounts table again.
8. Specify 210 sessions and import the rows from your data set to your empty userid.Accounts table. Note the timing and verify the number of rows.
   Time: ________  Number of rows: ________
9. What are your conclusions based on the tasks you have just performed?
   _______________________________________________________________________________________
   _______________________________________________________________________________________

Lab Exercises (cont.)


Lab Exercise 2-2
Purpose
In this exercise, you will use BTEQ to select 500 rows from the AU.Customer table, representing a specific set of 500 customers. First, you will use .EXPORT DATA to build a data set that contains 500 customer numbers; you will then use this as input to access the Customer table and use .EXPORT REPORT to generate a report file.

What you need
Populated AU.Customer table.

Tasks
1. From the AU.Customer table, export to a data file (data2_2a) the 500 customer numbers for the customers that have the highest Social Security numbers. (Hint: You will need to use descending order for Social Security numbers.)
2. Using the 500 customer numbers (in data2_2a) to select the 500 appropriate rows from AU.Customer, export a report file named data2_2b. In your report you will need the fields: Customer_Number, Last_Name, First_Name, Social_Security. (Hint: You will .IMPORT DATA from data2_2a and use .EXPORT REPORT to data2_2b.)
3. View your report. The completed report should look like this:
   Customer_Number  Last_Name  First_Name  Social_Security
              2001  Smith      John              123456789

4. What are highest and lowest Social Security numbers in your report? Highest: ________________________ Lowest: ________________________

Lab Solutions for Lab 2-1


Lab Exercise 2-1
cat lab216.btq

.SET SESSIONS 8
.LOGON u4455/tljc30,tljc30
.IMPORT DATA FILE = data2_1
.QUIET ON
.REPEAT *
USING in_account_number   (INTEGER),
      in_number           (INTEGER),
      in_street           (CHAR(25)),
      in_city             (CHAR(20)),
      in_state            (CHAR(2)),
      in_zip_code         (INTEGER),
      in_balance_forward  (DECIMAL(10,2)),
      in_balance_current  (DECIMAL(10,2))
INSERT INTO Accounts VALUES
      (:in_account_number
      ,:in_number
      ,:in_street
      ,:in_city
      ,:in_state
      ,:in_zip_code
      ,:in_balance_forward
      ,:in_balance_current);
.QUIT

bteq < lab216.btq

Lab Solutions for Lab 2-2


Lab Exercise 2-2
cat lab222.btq

.LOGON u4455/tljc20,tljc20
.IMPORT DATA FILE = data2_2a
.EXPORT REPORT FILE = data2_2b
.WIDTH 80
.REPEAT *
USING in_customer_number (INTEGER)
SELECT customer_number
     , last_name  (CHAR(10))
     , first_name (CHAR(10))
     , social_security
FROM   AU.Customer
WHERE  customer_number = :in_customer_number;
.EXPORT RESET
.QUIT

more data2_2b

CUSTOMER_NUMBER  LAST_NAME   FIRST_NAME  SOCIAL_SECURITY
           9000  Underwood   Anne              213633756
              :  :           :                         :
           8501  Atchison    Jose              213631261
