Need your urgent help. We have a simple Standard Job that reads data from Hive and loads it into SAP HANA.
The record count is almost 4 million.
The Job keeps failing after running for about 12 hours, with this error:
[FATAL]: Hive_to_Hana_LoadtHiveInput_8 java.io.IOException: java.io.IOException: Error reading file: hdfs://**/**/hive/**.db/testtable/ingested_date=2017-02-27/000000_0
We have checked, and this file does exist on HDFS.
The Job has 3 components.
We have set the "Temp data directory path" to a location on the server where the Job executes, and we have also increased the maximum buffer size in the tMap component to 5 million rows.
But the Job is still failing.
We have another Job that reads 2 million records, and it executed fine, although it also took many hours.
Please suggest what we can do.
There is also the Big Data Batch Job type; I tried that, but that Job template has no components to read from Hive and load into HANA.
Any suggestion would be a great help!