Teradata- Load Utilities

Tera-Tom Tera-Cram for Teradata Basics V12: Understanding is the Key! by Tom Coffing, Coffing Data Warehousing. (c) 2011. Copying Prohibited. Reprinted for N Kodanda Ramudu, Cognizant Technology Solutions ([email protected]). Reprinted with permission as a subscription benefit of Skillport, http://skillport.books24x7.com/. All rights reserved. Reproduction and/or distribution in whole or in part in electronic, paper or other forms without written permission is prohibited.

Transcript of Teradata- Load Utilities


Chapter 14: Load Utilities

Teradata Load Utilities

“I don’t know who my grandfather was. I am more interested in who his grandson will become.” 

– Abraham Lincoln, 16th president of the United States

My son once told me he did not feel like studying. I said to him, “When Abraham Lincoln was your age, he studied by candlelight.” My son retorted, “When Abraham Lincoln was your age, he was president.” 

Data within a warehouse environment is often historic in nature, so the sheer volume of data can overwhelm many systems. But, not Teradata!

“Abraham Lincoln will go down as one of the greatest presidents in history, but Teradata is even better because it will not go down when it loads history.” 

– Tom Coffing, 1st president of Coffing Data Warehousing

Teradata is so advanced in the data-loading department that other database vendors can’t hold a candle to it. A Teradata data warehouse brings enormous amounts of data into the system. This is an area that most companies overlook when purchasing a data warehouse. Most company officials think loading of data is simply that – just loading data. Some people actually ask, “Are data loads that critical?” Come on, ASCII stupid question and get a stupid ANSI.

BTEQ was the first utility and query tool for Teradata. It can be used as a query tool, to load data into Teradata a row at a time, and to export data from Teradata a row at a time.

FastLoad loads data into empty Teradata tables in 64K blocks. You can't have any Secondary Indexes, Referential Integrity, or Join Indexes on the target table. It is really FAST!


Teradata Load Utilities Continued

“Good design can’t fix broken business models.”

- Jeffrey Veen

Where FastLoad is meant to populate empty tables with INSERTs, MultiLoad is meant to process INSERTs, UPDATEs, and DELETEs on tables that have existing data. MultiLoad is extremely fast. One major Teradata data warehouse company processes 120 million inserts, updates, and deletes nightly during its batch window.

The TPump utility is designed to allow OLTP transactions to immediately load into a data warehouse. When I started working with Teradata, more than 10 years ago, most companies loaded data on a monthly basis. Suddenly, companies began to load data weekly.

Today, most companies load data nightly, and industry leaders are loading data hourly. TPump is the beginning step of an “Active Data Warehouse (ADW)”. ADW combines OLTP transactions with the power of a Decision Support System (DSS).

The TPump utility theoretically acts like a water faucet. TPump can be set to full throttle to load millions of transactions during off peak hours or “turned down” to trickle small amounts of data during the data warehouse daily rush hour. It can also be automatically preset to load levels at certain times during the day, and can be modified at any time.

Also, TPump locks at a row level so users have access to the rest of the rows while the table is being loaded. Another advantage of this load utility is that it allows for multiple updates to be conducted on a table simultaneously.

FastExport is designed to export data off of Teradata in 64K blocks. The rule of thumb: use BTEQ if there are fewer than 500,000 rows and FastExport if there are more than 500,000 rows.

You will also hear about TPT. This is a combination of all the utilities in one common language.


BTEQ

“The philosophy exam was a piece of cake – which was a bit of a surprise, actually, because I was expecting some questions on a sheet of paper.”

- Smith & Jones

One of the most important things about BTEQ is learning how to pronounce it. It is pronounced Bee-Teeeek. BTEQ is a report writer, whereas SQL Assistant and the Nexus Query Chameleon are more spreadsheet oriented. This allows BTEQ to do some special things with reports, such as side titles and the use of commands like WITH BY.

We will also get some experience using BTEQ to import and export data to and from Teradata.


The Four Types of BTEQ Exports

“If absolute power corrupts absolutely, does absolute powerlessness make you pure?”

- Harry Shearer

As you can see on the following page, there are four types of BTEQ exports. BTEQ allows for multiple techniques to export data. We usually think of an export as moving data off of Teradata into a normal flat file. That is example number one, and it is called RECORD mode.

Sometimes there are NULLs in your data, and if you export them to a mainframe, the mainframe application could run into problems interpreting them. That is why INDICDATA mode places indicator bits at the front of each record to flag NULL values.

BTEQ can also take your SQL output, include the headers, and export everything together so it looks like an electronic report. That is EXPORT REPORT mode.

The last mode is DIF, used when you want flat files that can be read by PC applications that utilize the Data Interchange Format.
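The modes above are selected with the .EXPORT command. As a minimal sketch of a RECORD-mode export (the logon string, file path, and column names are assumptions for illustration; swapping DATA for INDICDATA, REPORT, or DIF selects the other three modes):

```sql
.LOGON demotdp/sql01,mypassword;

/* RECORD mode: raw data rows, no headers */
.EXPORT DATA FILE = C:\Temp\CustExport.txt
SELECT Customer_Number
      ,Customer_Name
FROM   SQL_Class.Customer_Table;

.EXPORT RESET
.QUIT
```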


BTEQ IMPORT Script

“We do not have censorship. What we have is a limitation on what newspapers can report.”

- Louis Nel, Deputy Minister of Information, South Africa

The following page shows an excellent example of a BTEQ IMPORT Script. We are taking data from our flat file called C:\Temp\CustData.txt and importing the records into the SQL_Class.Customer_Table.
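Since the script page itself is not reproduced in this transcript, here is a hedged sketch of what such a BTEQ IMPORT script typically looks like. The file and table names come from the text above; the logon string and column layout are assumptions for illustration:

```sql
.LOGON demotdp/sql01,mypassword;

.IMPORT DATA FILE = C:\Temp\CustData.txt
.QUIET ON
.REPEAT *          /* repeat the INSERT for every record in the file */
USING Customer_Number (INTEGER)
     ,Customer_Name   (VARCHAR(20))
     ,Phone_Number    (CHAR(10))
INSERT INTO SQL_Class.Customer_Table
VALUES (:Customer_Number, :Customer_Name, :Phone_Number);

.QUIT
```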

FastLoad


“Where there is no patrol car, there is no speed limit.” 

- Al Capone

FastLoad is known for its lightning-like speed in loading vast amounts of data from flat files from a host into empty tables in Teradata. Part of this speed is achieved because it does not use the Transient Journal. But, regardless of the reasons that it is fast, know that FastLoad was developed to load millions of rows into empty Teradata tables.

FastLoad loads data into EMPTY Teradata tables in 64K blocks. The only command FastLoad understands is INSERT! That is exactly what FastLoad does.

Rule #1: No Secondary Indexes are allowed on the Target Table. For high performance, FastLoad will only utilize the Primary Index when loading. The reason is that Primary Indexes (UPI and NUPI) are used in Teradata to distribute the rows evenly across the AMPs, and FastLoad builds only data rows. A secondary index is stored in a subtable block, often on a different AMP from the data row, and maintaining it would slow FastLoad down.

Rule #2: No Referential Integrity is allowed. FastLoad cannot load data into tables that are defined with Referential Integrity (RI). This would require too much system checking to enforce referential constraints against a different table, and FastLoad works on only one table. In short, RI constraints will need to be dropped from the target table prior to the use of FastLoad.

Rule #3: No Triggers are allowed at load time. FastLoad is much too focused on speed to pay attention to the needs of other tables, which is what Triggers are all about. Simply ALTER the Triggers to the DISABLED status prior to using FastLoad.

Rule #4: No AMPs may go down (i.e., go offline) while FastLoad is processing. The down AMP must be repaired before the load process can be restarted. Other than this, FastLoad can recover from system glitches and perform restarts. We will discuss Restarts later in this chapter.

Rule #5: No more than one data type conversion is allowed per column during a FastLoad. Why just one? Data type conversion is a highly resource-intensive job on the system, requiring a "search and replace" effort, and that takes more time. Enough said!

FastLoad has Two Phases

“Do you not know, my son, with what little understanding the world is ruled?”


- Pope Julius III

FastLoad divides its job into two phases, both designed for speed. They have no fancy names but are typically known simply as Phase 1 and Phase 2. Sometimes they are referred to as Acquisition Phase and Application Phase.

The job of Phase 1 is to get the data off the mainframe or server and move it over the network into Teradata. The data moves in 64K blocks and is stored in worktables on the AMPs. The data rows are not on the correct AMP yet!

When all of the data has been moved from the server or mainframe flat file, each AMP will hash its worktable rows so that each row transfers to the worktable on the proper destination AMP.

A Sample FastLoad Script

“Too bad all the people who know how to run the county are busy driving cabs and cutting hair.”

- George Burns

The following page shows you an example of a FastLoad script.

This script is designed to INSERT into an empty Teradata table called Employee_Table. This table exists in the database SQL01.

FastLoad will first logon. Then it will build the table structure (unless it already exists and is empty). Then it will begin loading, but it will always define two error tables. A checkpoint is optional.

Then the INSERT is performed and we are done.
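The script page is not reproduced in this transcript, so here is a hedged sketch of a FastLoad script following those steps. The target table and database come from the text; the logon string, column layout, flat file name, and error-table names are assumptions for illustration:

```sql
LOGON demotdp/sql01,mypassword;

CREATE TABLE SQL01.Employee_Table
( Employee_No INTEGER
 ,Last_Name   CHAR(20)
 ,First_Name  VARCHAR(12)
 ,Salary      DECIMAL(10,2)
 ,Dept_No     SMALLINT )
UNIQUE PRIMARY INDEX (Employee_No);

/* The two error tables are always defined; CHECKPOINT is optional */
BEGIN LOADING SQL01.Employee_Table
   ERRORFILES SQL01.Emp_Err1, SQL01.Emp_Err2
   CHECKPOINT 100000;

DEFINE Employee_No (INTEGER)
      ,Last_Name   (CHAR(20))
      ,First_Name  (VARCHAR(12))
      ,Salary      (DECIMAL(10,2))
      ,Dept_No     (SMALLINT)
FILE = C:\Temp\emp_flat_file.txt;

INSERT INTO SQL01.Employee_Table VALUES
( :Employee_No, :Last_Name, :First_Name, :Salary, :Dept_No );

END LOADING;
LOGOFF;
```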


The SmartScript FastLoad Builder

“Don’t worry about the world coming to an end today. It’s already tomorrow in Australia.”

- Charles Schultz

Most of the script is built already. The Nexus Query Chameleon is called the chameleon because it can change colors and fit in any environment. In this case the chameleon is Teradata Orange and has blended in perfectly. How? Nexus already knows the table and the database it exists in and has built the table definition inside your script for you.

The Chameleon has also placed two error files inside the script as required and gives you the ability to change the names and database should you desire.

You will need to tell the Chameleon the name of your flat file. The one you want to load fast into the empty Teradata table.

Now you are ready to hit the Build Script button.


MultiLoad

“The only limit to our realization of tomorrow will be our doubts of today.”

- Franklin D. Roosevelt

If we were going to be stranded on an island with a Teradata Data Warehouse and we could only take along one Teradata load utility, clearly, MultiLoad would be our choice. MultiLoad has the capability to load multiple tables at one time from either a LAN or Channel environment. This is in stark contrast to its fleet-footed cousin, FastLoad, which can only load one table at a time. And it gets better, yet!

This feature-rich utility can perform multiple types of DML tasks, including INSERT, UPDATE, DELETE and UPSERT, on up to five (5) empty or populated target tables at a time. These DML functions may be run either solo or in combinations, against one or more tables. For these reasons, MultiLoad is the utility of choice when it comes to loading populated tables in the batch environment. As the volume of data being loaded or updated in a single block increases, the performance of MultiLoad improves.

Unique Secondary Indexes are not supported on a Target Table. Like FastLoad, MultiLoad does not support Unique Secondary Indexes (USIs). But unlike FastLoad, it does support the use of Non-Unique Secondary Indexes (NUSIs) because the index subtable row is on the same AMP as the data row. MultiLoad uses every AMP independently and in parallel. If two AMPs must communicate, they are not independent. Therefore, a NUSI (same AMP) is fine, but a USI (different AMP) is not.

Referential Integrity is not supported. MultiLoad will not load data into tables that are defined with Referential Integrity (RI). Like a USI, this requires the AMPs to communicate with each other. So, RI constraints must be dropped from the target table prior to using MultiLoad.

Triggers are not supported. Triggers cause actions on related tables based upon what happens in a target table. Again, this is a multi-AMP operation against a different table. To keep MultiLoad running smoothly, disable all Triggers prior to using it.

No concatenation of input files is allowed. MultiLoad does not allow this because it could impact a restart if the files were concatenated in a different sequence or data was deleted between runs.

No Join Indexes. You must drop all Join Indexes before running a MultiLoad and then recreate them after the load is finished.


A Sample MultiLoad Script

“A friend is a gift you give yourself.”

- Robert Louis Stevenson

The script on the following page follows these steps:

- Setting up a Logtable
- Logging onto Teradata
- Identifying the Target, Work and Error tables
- Defining the INPUT flat file
- Defining the DML activities to occur
- Naming the IMPORT file
- Telling MultiLoad to use a particular LAYOUT
- Telling the system to start loading
- Finishing loading and logging off of Teradata

This first script example is designed to show MultiLoad IMPORT in its simplest form. It depicts the updating of the Employee table. The actual script is in the left column and our comments are on the right.

Step One: Setting up a Logtable and Logging onto Teradata — MultiLoad requires you to specify a log table right at the outset with the .LOGTABLE command. Immediately after this you log onto Teradata using the .LOGON command.

Step Two: Identifying the Target, Work and Error tables — In this step of the script you must tell Teradata which TABLES, WORKTABLES and ERROR TABLES to use. All you must do is name the tables and specify what database they are in. Work tables and error tables are created automatically for you.

Step Three: Defining the INPUT flat file record structure — MultiLoad is going to need to know the structure of the INPUT flat file.


Step Four: Defining the DML activities to occur — The .DML LABEL names and defines the SQL that is to execute. In this example we are going to UPDATE.

Step Five: Naming the INPUT file and its format type — This step is vital! Using the .IMPORT command, we have identified the INFILE data as being contained in a file called "mload_flat_file.txt". Next, we referenced the LAYOUT named FileColDesc1 to describe the fields in the record. Finally, we told MultiLoad to APPLY the DML LABEL called EMP_UPD.
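The script page itself is not reproduced in this transcript, so here is a hedged sketch of a MultiLoad IMPORT script covering those five steps. The file name (mload_flat_file.txt), LAYOUT name (FileColDesc1), and DML LABEL (EMP_UPD) come from the text; the logon string, log/work/error table names, and columns are assumptions for illustration:

```sql
.LOGTABLE SQL01.Mload_Log;
.LOGON demotdp/sql01,mypassword;

.BEGIN IMPORT MLOAD
   TABLES SQL01.Employee_Table
   WORKTABLES SQL01.Emp_WT
   ERRORTABLES SQL01.Emp_ET SQL01.Emp_UV;

.LAYOUT FileColDesc1;
.FIELD Employee_No * INTEGER;
.FIELD Salary      * DECIMAL(10,2);

.DML LABEL EMP_UPD;
UPDATE SQL01.Employee_Table
SET    Salary = :Salary
WHERE  Employee_No = :Employee_No;

.IMPORT INFILE mload_flat_file.txt
   LAYOUT FileColDesc1
   APPLY EMP_UPD;

.END MLOAD;
.LOGOFF;
```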

The SmartScript MultiLoad Builder

“Good communication is as stimulating as black coffee and just as hard to sleep after.”

- Anne Morrow Lindbergh, 'Gift from the Sea'

The MultiLoad Script Builder will have already built much of your script for you. Notice on the following page the red arrows that show the different tabs. You merely move through each tab and fill in the MultiLoad parameters. Most of the parameters are already set for you with the defaults. You can change the parameters.

When you are ready, hit Build Script; you can then see the entire script and still make changes to it.


TPump

“He who is of calm and happy nature will hardly feel the pressure of age, but to him who is of an opposite disposition youth and age are equally a burden.”

- Plato (427 BC – 347 BC), the Republic

Do you remember the first Swiss Army™ knife you ever owned? Aside from its original intent as a compact survival tool, this knife has thrilled generations with its multiple capabilities. TPump is the Swiss Army™ knife of the Teradata load utilities. Just as this knife was designed for small tasks, TPump was developed to handle batch loads with low volumes.

And, just as the Swiss Army™ knife easily fits in your pocket when you are loaded down with gear, TPump is a perfect fit when you have a large, busy system with few resources to spare. Let's look in more detail at the many facets of this amazing load tool.

Why It Is Called “TPump” 

TPump is the shortened name for the load utility Teradata Parallel Data Pump. To understand this, you must know how the load utilities move the data. Both FastLoad and MultiLoad assemble massive volumes of data rows into 64K blocks and then move those blocks. Picture in your mind the way that huge ice blocks used to be floated down long rivers to large cities prior to the advent of refrigeration. There they were cut up and distributed to the people. TPump does NOT move data in large blocks. Instead, it loads data one row at a time, using row hash locks. Because it locks at this level, and not at the table level like MultiLoad, TPump can make many simultaneous, or concurrent, updates on a table.

Envision TPump as the water pump on a well. Pumped in a very slow, gentle manner, it yields a steady trickle of water that could be caught in a cup. But strong and steady pumping results in a powerful stream of water that would require a larger container. TPump is a data pump which, like the water pump, may allow either a trickle-feed of data to flow into the warehouse or a strong and steady stream. In essence, you may "throttle" the flow of data based upon your system and business user requirements. Remember, TPump is THE PUMP! Look on the next page at all the great things TPump can do.


Limitations of TPump

"To keep the heart unwrinkled, to be hopeful, kindly, cheerful, and reverent – that is to triumph over old age."

- Thomas Bailey Aldrich, O Magazine, October 2003

TPump has rightfully earned its place as a superstar in the family of Teradata load utilities. But this does not mean that it has no limits. It has a few that we will list here for you:

Rule #1: No concatenation of input data files is allowed. TPump is not designed to support this.

Rule #2: TPump will not process aggregates, arithmetic functions or exponentiation. If you need data conversions or math, you might consider using an INMOD to prepare the data prior to loading it.

Rule #3: The use of the SELECT function is not allowed. You may not use SELECT in your SQL statements.

Rule #4: No more than four IMPORT commands may be used in a single load task. This means that at most, four files can be directly read in a single run.

Rule #5: Dates before 1900 or after 1999 must be represented by the yyyy format for the year portion of the date, not the default format of yy. This must be specified when you create the table. Any dates using the default yy format for the year are taken to mean 20th century years.

Rule #6: On some network attached systems, the maximum file size when using TPump is 2GB. This is true for a computer running under a 32-bit operating system.

Rule #7: TPump performance will be diminished if Access Logging is used. The reason for this is that TPump uses normal SQL to accomplish its tasks. Besides the extra overhead incurred, if you use Access Logging for successful table updates, then Teradata will make an entry in the Access Log table for each operation. This can cause the potential for row hash conflicts between the Access Log and the target tables.


A Sample TPump Script

“Computer Science is no more about computers than astronomy is about telescopes.”

- E. W. Dijkstra

The script on the following page follows these steps:

- Setting up a Logtable
- Logging onto Teradata
- Identifying the Target, Work and Error tables
- Defining the INPUT flat file
- Defining the DML activities to occur
- Naming the IMPORT file
- Telling TPump to use a particular LAYOUT
- Telling the system to start loading
- Finishing and logging off of Teradata

This first script example is designed to show a TPump script in its simplest form. It depicts inserting into the Employee table. The actual script is in the left column and our comments are on the right.

Step One: Setting up a Logtable and Logging onto Teradata — TPump requires you to specify a log table right at the outset with the .LOGTABLE command. Immediately after this you log onto Teradata using the .LOGON command.

Step Two: BEGIN LOAD and then define the target table and worktable — In this step of the script you must tell Teradata which TABLE is the target table, and you define the name of the ERROR TABLE to use. All you must do is name the tables and specify what database they are in. Error tables are created automatically for you.

Step Three: Defining the INPUT flat file record structure — TPump is going to need to know the structure of the INPUT flat file.


Step Four: Defining the DML activities to occur — The DML LABEL names and defines the SQL that is to execute. In this example we are going to INSERT.

Step Five: Naming the INPUT file and its format type — This step is vital! Using the .IMPORT command, we have identified the INFILE name, pointed to the layout, and told the script which labels to APPLY.
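The script page is not reproduced in this transcript, so here is a hedged sketch of a TPump script covering those steps. The logon string, table, layout and label names, file name, and tuning values (SESSIONS, PACK, RATE) are all assumptions for illustration:

```sql
.LOGTABLE SQL01.Tpump_Log;
.LOGON demotdp/sql01,mypassword;

.BEGIN LOAD
   SESSIONS 4
   ERRORTABLE SQL01.Tpump_Err
   PACK 20          /* statements packed per request */
   RATE 1000;       /* throttle: statements per minute */

.LAYOUT FileLayout1;
.FIELD Employee_No * INTEGER;
.FIELD Last_Name   * CHAR(20);

.DML LABEL EMP_INS;
INSERT INTO SQL01.Employee_Table (Employee_No, Last_Name)
VALUES (:Employee_No, :Last_Name);

.IMPORT INFILE tpump_flat_file.txt
   LAYOUT FileLayout1
   APPLY EMP_INS;

.END LOAD;
.LOGOFF;
```

The RATE parameter is the "water faucet" described earlier: lowering it trickles rows in during busy hours, raising it opens the throttle during off-peak windows.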

The SmartScript TPump Builder

“The only difference between a rut and a grave… is in their dimensions.”

- Ellen Glasgow

The TPump Script Builder will have already built much of your script for you. Notice on the following page the red arrows that show the different tabs. You merely move through each tab and fill in the TPump parameters. Most of the parameters are already set for you with the defaults. You can change the parameters.

When you are ready, hit Build Script; you can then see the entire script and still make changes to it.


FastExport

“I saw the angel in the marble and I carved until I set it free.” 

- Michelangelo

FastExport exports data off of Teradata to a flat file. BTEQ can also do this, but BTEQ exports data in rows. FastExport exports data in 64K blocks, so when speed is necessary, FastExport is the export tool of choice. FastExport uses a SELECT statement to define what data to extract, and the utility handles the extraction.

Because FastExport is a 64K block utility, it counts toward the limit of 15 block utilities. That means a system can't run more than a combined 15 FastLoads, MultiLoads, and FastExports at one time.

The next page shows a sample FastExport script.
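Since the sample page itself is not reproduced in this transcript, here is a minimal FastExport sketch (the logon string, log table, output file, and column names are assumptions for illustration):

```sql
.LOGTABLE SQL01.FExp_Log;
.LOGON demotdp/sql01,mypassword;

.BEGIN EXPORT SESSIONS 4;

/* Export in 64K blocks to a flat file */
.EXPORT OUTFILE C:\Temp\emp_export.txt
   MODE RECORD FORMAT TEXT;

SELECT Employee_No
      ,Last_Name
FROM   SQL01.Employee_Table;

.END EXPORT;
.LOGOFF;
```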


Teradata Parallel Transport

“The young have aspirations that never come to pass, the old have reminiscences of what never happened.”

- Saki (1870-1916)

The Teradata Parallel Transport (TPT) utility combines the BTEQ, FastLoad, MultiLoad, TPump, and FastExport utilities into one comprehensive language. This allows TPT to insert data into tables, export data from tables, and update tables.

TPT works around the concept of Operators and Data Streams. An Operator reads the Source data and passes the contents of that Source to a data stream, where another operator is responsible for taking the Data Stream and loading it to disk. Notice on the following page that we have a flat file as our Source. A Producer Operator, designed to read input, will move the data to a Data Stream. The Consumer Operator, designed to write data to a Teradata table, will then load the data.


TPT Operators and their Functions

“You’re never too old to become younger.”

- Mae West (1892-1980)

The picture on the following page is starting to get to the meat of TPT. Here are three operators: the Producer, or READ, Operator; the Filter, or TRANSFORM, Operator; and the Consumer, or WRITE, Operator.

The Producer Operator can read Queues, Files, Relational Databases, and Non-Relational Sources.

The Filter Operators can Transform data from INMODs, WHERE Clauses, APPLY Filters, and User-Defined functions.

The Consumer Operator can perform INSERTs (Load), Updates, SQL Inserts, and TPump-like Streams.


TPT Operator Types

“Never take the advice of someone who has not had your kind of trouble.”

- Sidney J. Harris

The following page shows the four operator types.

TPT Operators and their Equivalent Load Utility


“Martyrdom…is the only way in which a man can become famous without ability.”

- George Bernard Shaw (1856-1950), The Devil's Disciple (1901), act 3

The slide on the following page shows the TPT Operators and their equivalent Teradata Load Utilities.

How to Run a TPT Script

“I’ll moider da bum.”

- Tony Galento, heavyweight boxer, when asked what he thought of William Shakespeare

The easiest way to run a TPT script is to use the tbuild utility. You first create your script and then run tbuild, passing it the name of the script to run.

The example on the following page shows the creation of a script called ScriptName.txt. This is not a complete script, but just enough to show the idea that we have created a TPT script.

Then below you can see we ran the tbuild -f command, which says run TPT on the script you're about to see. TPT will then run the script ScriptName.txt in the C:\temp directory.
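As a hedged sketch of the idea (the job name, schema, operator names, attributes, and credentials are all illustrative assumptions, not the book's actual script), a minimal TPT job file and the tbuild command to run it might look like this:

```sql
DEFINE JOB Load_Employee_Table
DESCRIPTION 'Read a flat file and load it with the Load operator'
(
  DEFINE SCHEMA Emp_Schema
  ( Employee_No INTEGER,
    Last_Name   CHAR(20) );

  /* Producer operator: reads the source flat file */
  DEFINE OPERATOR File_Reader
  TYPE DATACONNECTOR PRODUCER
  SCHEMA Emp_Schema
  ATTRIBUTES
  ( VARCHAR DirectoryPath = 'C:\temp\',
    VARCHAR FileName      = 'emp_flat_file.txt',
    VARCHAR Format        = 'Delimited',
    VARCHAR TextDelimiter = ',' );

  /* Consumer operator: the FastLoad-equivalent Load operator */
  DEFINE OPERATOR Load_Writer
  TYPE LOAD
  SCHEMA *
  ATTRIBUTES
  ( VARCHAR TdpId        = 'demotdp',
    VARCHAR UserName     = 'sql01',
    VARCHAR UserPassword = 'mypassword',
    VARCHAR TargetTable  = 'SQL01.Employee_Table',
    VARCHAR LogTable     = 'SQL01.Emp_TPT_Log' );

  APPLY
  ( 'INSERT INTO SQL01.Employee_Table VALUES (:Employee_No, :Last_Name);' )
  TO OPERATOR ( Load_Writer )
  SELECT * FROM OPERATOR ( File_Reader );
);

/* Then, from the command line:  tbuild -f C:\temp\ScriptName.txt */
```

The Producer reads the source into the data stream and the Consumer applies it to the table, which is exactly the Operator/Data Stream picture described above.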


Convert Tables DDL

The DDL Converter allows you to convert Teradata table DDL to any other database vendor's format. Below we have converted a Teradata table into a SQL Server table. Nexus is so advanced that you can literally convert table structures from any vendor to any other vendor like magic. You can download a free trial of the Nexus Query Chameleon at www.CoffingDW.com.

Watch the Tera-Tom Video on his trip to India


You can watch the video by clicking on the link below or by copying and pasting the link into your browser. If you ever get the chance to go to India, you will be so lucky. I have taught over there three times, and it was the trip of a lifetime. All film was created, edited, and produced by me (your man Tera-Tom), and I also provide some advice and last words about learning Teradata. Be prepared to be amazed at what you see.

http://www.CoffingDW.com/TBasicsV12/india.wmv

The END
