I have a Talend job that pulls records from a delimited input file, cleans several columns, and writes the cleaned data to a new delimited file at over 7,300 rows per second.
After the cleaned records are finished writing to the output file, a subjob loads them into a Teradata database table. The subjob errors out if I increase the batch size above 250 records. With a batch size of 250 records and a commit every 15,000 records, the load runs at about 780 rows per second.
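For context, this is roughly the plain-JDBC equivalent of what I understand those two settings to mean (a minimal sketch with a hypothetical two-column table and made-up connection details, not my actual job code):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.Arrays;
import java.util.List;

public class TeradataBatchSketch {
    public static void main(String[] args) throws Exception {
        // Older TeraJDBC releases may need the driver loaded explicitly.
        Class.forName("com.teradata.jdbc.TeraDriver");

        // Hypothetical host, database, and credentials; substitute your own.
        Connection conn = DriverManager.getConnection(
                "jdbc:teradata://tdhost/DATABASE=mydb", "user", "pass");
        conn.setAutoCommit(false); // commit manually at the interval below

        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO cleaned_records (col1, col2) VALUES (?, ?)");

        final int BATCH_SIZE = 250;     // largest batch size that works for me
        final int COMMIT_EVERY = 15000; // commit interval from the job settings

        // Stand-in for rows read from the cleaned delimited file.
        List<String[]> rows = Arrays.asList(
                new String[]{"a", "1"}, new String[]{"b", "2"});

        int count = 0;
        for (String[] row : rows) {
            ps.setString(1, row[0]);
            ps.setString(2, row[1]);
            ps.addBatch();
            if (++count % BATCH_SIZE == 0) {
                ps.executeBatch(); // one PreparedStatement batch request
            }
            if (count % COMMIT_EVERY == 0) {
                conn.commit();
            }
        }
        ps.executeBatch(); // flush the final partial batch
        conn.commit();
        ps.close();
        conn.close();
    }
}
```

In other words, at batch size 250 the driver sends a batch request every 250 rows and commits every 60 batches.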
The error message is generic:
[Teradata JDBC Driver] [TeraJDBC 13.00.00.16] [Error 1338] [SQLState HY000] A failure occurred while executing a PreparedStatement batch request. Details of the failure can be found in the exception chain that is accessible with getNextException.
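To get past that generic wrapper, my plan is to walk the exception chain the message points to, along these lines (a sketch; the dumpChain helper and where to hook it in are my own, e.g. a small standalone JDBC test harness rather than the Talend component itself):

```java
import java.sql.BatchUpdateException;
import java.sql.SQLException;

public class TeradataErrorChain {
    // Print every exception in the JDBC chain; the real Teradata failure
    // (e.g. a constraint violation or a data conversion error) is usually
    // one or two links behind the generic "Error 1338" wrapper.
    public static void dumpChain(SQLException e) {
        while (e != null) {
            System.err.println("SQLState=" + e.getSQLState()
                    + " ErrorCode=" + e.getErrorCode()
                    + " : " + e.getMessage());
            if (e instanceof BatchUpdateException) {
                int[] counts = ((BatchUpdateException) e).getUpdateCounts();
                System.err.println("  rows attempted in batch: " + counts.length);
            }
            e = e.getNextException(); // follow the chain the error mentions
        }
    }
}
```

This would be wrapped around the executeBatch() call in the sketch above, e.g. catch (BatchUpdateException bue) { TeradataErrorChain.dumpChain(bue); }.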
Here is what my job looks like:
tFileInputDelimited_1 > tMap_1 > tFileOutputDelimited_1
Subjob: tFileInputDelimited_2 > tMap_2 > tTeradataOutput
1. With the Teradata driver that Talend uses, is there a memory usage limitation?
2. What is the highest load rate (rows per second) you have seen anyone achieve into Teradata with a Talend job?