The Teradata Destination Component is an SSIS Data Flow component that connects to a Teradata database and bulk loads data into Teradata database tables. It is up to 80x faster than loading with the standard ODBC Destination. Configuration parameters are provided in the standard data flow destination component dialog. The component requires Teradata Tools and Utilities 13.10. Install the following Teradata components:
- Shared ICU libraries for Teradata
- Teradata GSS client
- ODBC Driver for Teradata
- Teradata Parallel Transporter (Teradata PT)
- Teradata Parallel Transporter API
- Teradata PT Load Operator
- Teradata PT Stream Operator
- Teradata PT Update Operator
Teradata Tools and Utilities post-installation steps
- Define TPT and TPT64 environment variables, referencing the root Teradata Parallel Transporter folder. The default folder is: \Program Files\Teradata\Client\13.10\Teradata Parallel Transporter
- Make sure the following folders are included in your PATH:
- \Program Files\Teradata\Client\13.10\ODBC Driver for Teradata\Lib\
- \Program Files\Teradata\Client\13.10\Shared ICU Libraries for Teradata\lib\
- \Program Files\Teradata\Client\13.10\Teradata Parallel Transporter\bin
- \Program Files\Teradata\Client\13.10\CLIv2
- \Program Files\Teradata\Client\13.10\Teradata Parallel Transporter\msg
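The environment-variable step above can be scripted with `setx`. The following is a sketch only; the paths assume the default 13.10 installation location on drive C:

```bat
:: Sketch only: paths assume the default 13.10 install location on drive C.
setx TPT "C:\Program Files\Teradata\Client\13.10\Teradata Parallel Transporter"
setx TPT64 "C:\Program Files\Teradata\Client\13.10\Teradata Parallel Transporter"
```

For the PATH entries, editing the variable through System Properties > Environment Variables is safer than `setx`, which truncates long values.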
Use the parameters below to set up the component.
Select an existing ODBC connection manager. The ODBC connection manager is used only for connecting with the Teradata Parallel Transporter; the ODBC API is not used for the bulk-load process.
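For reference, the ODBC connection behind such a connection manager is typically defined by a Teradata DSN or a connection string along these lines (the server name and credentials below are placeholders):

```
Driver={Teradata};DBCName=my-teradata-server;Uid=myUser;Pwd=myPassword;
```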
Related Topics: ODBC Connection Manager
Select the bulk-load driver. This parameter has the options listed in the following table.
|Option|Description|
|---|---|
|Load|Use the Load driver. Use this option when the destination table is empty and there are no secondary indexes defined.|
|Update|Use the Update driver. Use this option when the destination table already contains data. The destination is locked during processing.|
|Stream|Use the Stream driver. Use this option when you want to avoid locking of the destination table during processing.|
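The rules in the table above can be summarized in a small sketch (the function and its inputs are illustrative, not part of the component):

```python
def choose_driver(table_is_empty: bool, has_secondary_indexes: bool,
                  must_avoid_locking: bool) -> str:
    """Suggest a bulk-load driver based on the rules in the table above."""
    if must_avoid_locking:
        # Stream does not lock the destination table during processing.
        return "Stream"
    if table_is_empty and not has_secondary_indexes:
        # Load requires an empty table with no secondary indexes.
        return "Load"
    # Update handles tables that already contain data (locks the table).
    return "Update"
```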
Specify the destination database table where the data is to be loaded.
Specify the maximum number of errors allowed before the component stops processing. The default is 0.
Indicates whether full security encryption of SQL requests, responses and data is to be used. This parameter has the options listed in the following table.
|Option|Description|
|---|---|
|True|All SQL requests, responses, and data are encrypted.|
|False|No encryption occurs. This is the default setting.|
Specifies an error table. If not specified, a default error table will be set up with a name based on the selected destination table.
Specifies the minimum number of sessions to be used during processing. The default is 1.
Specifies the maximum number of sessions to be used during processing. The default is 4.
Specifies the name of the external file to use for tracing messages.
Specifies the number of hours that the driver continues trying to log on when the maximum number of load and export operations are already running on the Teradata Database. The default value is four hours.
Specifies the number of minutes that the driver pauses before retrying to log on when the maximum number of load and export operations are already running on the Teradata Database. The default is six minutes.
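With the two defaults above, the driver keeps retrying roughly every six minutes for up to four hours, i.e. about 40 logon attempts. A quick sanity check (the helper function is illustrative):

```python
def max_logon_attempts(tenacity_hours: float, sleep_minutes: float) -> int:
    """Estimate how many logon retries fit into the tenacity window."""
    return int(tenacity_hours * 60 // sleep_minutes)

print(max_logon_attempts(4, 6))  # 40 attempts with the default settings
```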
Indicates whether the Stream operator uses an existing error table. This parameter has the options listed in the following table.
|Option|Description|
|---|---|
|True|The Stream operator uses the existing error table, or creates the error table if it does not exist.|
|False|The Stream operator does not use the existing error table. This is the default setting.|
How to modify the destination table dynamically at runtime
The destination table is available in the data flow properties list. Follow these steps to set up an expression:
- Right-click the data flow canvas and select Properties.
- Scroll down and find the property named like [Teradata Destination].[DestinationTable]; the first part reflects the component's name. This property contains the destination table.
- Scroll down and find the Expressions property. Set up an expression to modify the table dynamically.
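For example, assuming a package variable `User::TargetTable` holds the table name (the variable name is illustrative), the expression on the destination table property could be:

```
"STAGE_" + @[User::TargetTable]
```

At runtime the component then loads into a table whose name is the variable's value prefixed with STAGE_.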
Case study by Cathay Bank.
- To test the performance of the Teradata Destination against the standard ODBC Destination, see this package. To use the package, create a TEST_TABLE table with the following columns:
- COL1 INTEGER
- COL2 TIMESTAMP (6)
- COL3 FLOAT
- COL4 VARCHAR (200)
- COL5 VARCHAR (200)
- COL6 DECIMAL (9,5)
- COL7 VARCHAR (16000)
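The column list above corresponds to Teradata DDL along these lines (a sketch; add a database qualifier and index options as needed):

```sql
CREATE TABLE TEST_TABLE (
    COL1 INTEGER,
    COL2 TIMESTAMP(6),
    COL3 FLOAT,
    COL4 VARCHAR(200),
    COL5 VARCHAR(200),
    COL6 DECIMAL(9,5),
    COL7 VARCHAR(16000)
);
```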