Important! This site has been replaced. All content here is read-only. Please visit our brand-new community at https://community.talend.com/. We look forward to hearing from you there!



#1 2017-04-06 06:38:55

rajaji
Member
5 posts

rajaji said:

Talend Job failing with 4 million records

Hi Experts,
Need your urgent help. We have a simple Standard job that reads data from Hive and loads it into HANA.
The record count is almost 4 million.
The job consistently fails after running for about 12 hours.
Error message:
[FATAL]: Hive_to_Hana_LoadtHiveInput_8 java.io.IOException: java.io.IOException: Error reading file: hdfs://**/**/hive/**.db/testtable/ingested_date=2017-02-27/000000_0
We have checked, and this file exists.
The job has three components:
tHiveInput -> tMap -> Hana output
We have set the "Temp data directory" in the tMap component to a path on the server where the job executes, and we have also increased the maximum buffer size in tMap to 5 million rows.
But the job still fails.
We have another job that reads 2 million records; it executed fine, although it also took many hours.
Please suggest what we can do.
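For context on the buffer setting above: raising the tMap buffer to 5 million rows means holding nearly the whole dataset in memory at once, which is usually what exhausts the JVM on large loads. The alternative is to stream the source in fixed-size batches and commit each batch. A minimal sketch of that batching idea (plain Python, hypothetical names, no Hive/HANA drivers) would be:

```python
# Hypothetical sketch: instead of buffering all 4 million rows at once,
# stream the source in fixed-size batches. In a real job, each yielded
# batch would be written to the target and committed before the next
# batch is read, so memory use stays bounded by batch_size.

def batched(rows, batch_size):
    """Yield successive lists of at most batch_size rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final, possibly short, batch
        yield batch

# Example: 10 rows in batches of 4 -> batch sizes 4, 4, 2
sizes = [len(b) for b in batched(range(10), 4)]
print(sizes)  # [4, 4, 2]
```

In Talend terms, this corresponds to using a modest commit interval / batch size on the output component rather than a huge in-memory tMap buffer; the exact settings depend on the output component used.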
There is also the Big Data Batch job type; I tried that, but that job template has no component to read from Hive and load into Hana.
Any suggestion would be a great help!


