#1 2015-04-13 07:07:46

deepatalend
Member
7 posts

deepatalend said:

sqoop import

Tags: [error]

Hi,
I am using Talend Big Data version 5.6 and trying to import data from MySQL to Hadoop using Sqoop.
I am getting the error below:
[WARN ]: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
I have also uploaded a screenshot with this post.
Please help me.
Thanks
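
For reference, the SQOOP_HOME message above is only a warning and is usually not the failure itself; the real error normally shows up further down in the log. Under the hood, tSqoopImport drives a standard Sqoop import; a minimal command-line equivalent, with purely illustrative host names, credentials and paths, would look roughly like this:

# Illustrative sketch only - substitute your own host, database, credentials and target directory
sqoop import \
  --connect jdbc:mysql://mysql_host:3306/mydb \
  --username myuser --password mypass \
  --table cities \
  --target-dir /user/talend/cities \
  -m 1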

Offline

#2 2015-04-14 04:08:12

xdshi
Talend Team


xdshi said:

Re: sqoop import

Hi,

Thank you for your post! We can't see the screenshot on our side. Could you check it, please?
Make sure your screenshot is not bigger than 2MB; screenshots work only if you drag and drop the image directly into the editor window.

Best regards
Sabrina


What we can do is to make sure that Talend will be your best choice!

Offline

#3 2015-04-15 07:19:06

deepatalend
Member
7 posts

deepatalend said:

Re: sqoop import

Hi, please find the attached screenshot.

In case the screenshot is not available, I have copied the error below as well:

at project1.hg_0_1.hg.main(hg.java:999)
[INFO ]: org.apache.sqoop.Sqoop - Running Sqoop version: 1.4.4-cdh5.0.3
[WARN ]: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
[INFO ]: org.apache.sqoop.manager.MySQLManager - Preparing to use a MySQL streaming resultset.
[INFO ]: org.apache.sqoop.tool.CodeGenTool - Beginning code generation
[INFO ]: org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `cities` AS t LIMIT 1
[INFO ]: org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `cities` AS t LIMIT 1
[INFO ]: org.apache.sqoop.orm.CompilationManager - $HADOOP_MAPRED_HOME is not set
Note: \tmp\sqoop-307425\compile\282dc771b9ac8f2a1793016ce1cc564a\cities.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
[INFO ]: org.apache.sqoop.orm.CompilationManager - Writing jar file: \tmp\sqoop-307425\compile\282dc771b9ac8f2a1793016ce1cc564a\cities.jar
[INFO ]: org.apache.sqoop.manager.DirectMySQLManager - Beginning mysqldump fast path import
[INFO ]: org.apache.sqoop.mapreduce.ImportJobBase - Beginning import of cities
[WARN ]: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
[ERROR]: org.apache.sqoop.tool.ImportTool - Imported Failed: java.net.UnknownHostException: aster2

Exception in component tSqoopImport_1
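
The decisive line here is the last ERROR: java.net.UnknownHostException: aster2 means the machine running the Talend job cannot resolve the host name aster2 configured in the tSqoopImport NameNode/JobTracker settings. One common fix, sketched below with a purely illustrative IP address, is to map that host name on the machine running the job (or to use a resolvable FQDN or IP address directly in the component's URIs):

# Illustrative sketch only - add the cluster host to the hosts file of the machine running the job
# Linux/macOS: /etc/hosts   Windows: C:\Windows\System32\drivers\etc\hosts
192.168.56.101   aster2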

Last edited by deepatalend (2015-04-15 10:30:14)

Offline

#4 2015-04-15 10:55:47

xdshi
Talend Team


xdshi said:

Re: sqoop import

Hi,

Would you mind uploading your tSqoopImport component setting screenshot into forum?

Best regards
Sabrina


What we can do is to make sure that Talend will be your best choice!

Offline

#5 2015-04-15 11:09:53

deepatalend
Member
7 posts

deepatalend said:

Re: sqoop import

Hi, please find the screenshot.
I have dragged and dropped it into the editor window, but I am not sure whether it was uploaded.


Regards,
Deepa

Last edited by deepatalend (2015-04-15 11:12:50)

Offline

#6 2015-04-15 11:20:27

xdshi
Talend Team


xdshi said:

Re: sqoop import

Hi,

It seems that your screenshot is missing. Make sure your screenshot is not bigger than 2MB.

Best regards
Sabrina


What we can do is to make sure that Talend will be your best choice!

Offline

#7 2015-07-09 18:23:00

DaisukeSaito
Member
3 posts

DaisukeSaito said:

Re: sqoop import

Hi, I'm facing a similar issue with sqoop import.
I'm running Step_2_SQOOP_MYSQL_TO_HDFS_Imports on TOS-BD-5.6.1.20141207_1530 on Mac OS X, with a Hadoop sandbox (HDP 2.2) and MySQL at different IP addresses.
Step_1 of the demo job completed successfully, so the MySQL side looks properly configured; the error occurs in Step 2, when connecting to the Hadoop side.
I set the Hadoop distribution for tSqoopImport to "custom", using the Hortonworks Data Platform 2.2.X add-on downloaded from Talend Exchange.

 
The error was like this:

[statistics] connecting to socket on port 4062
[statistics] connected
[INFO ]: org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2015-07-09 09:30:34.238 java[70978:5503] Unable to load realm info from SCDynamicStore
[WARN ]: org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[INFO ]: org.apache.sqoop.Sqoop - Running Sqoop version: 1.4.5.2.2.4.2-2
[WARN ]: org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
[INFO ]: org.apache.sqoop.manager.SqlManager - Using default fetchSize of 1000
[INFO ]: org.apache.sqoop.tool.CodeGenTool - Beginning code generation
[INFO ]: org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `point` AS t LIMIT 1
[INFO ]: org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `point` AS t LIMIT 1
[INFO ]: org.apache.sqoop.orm.CompilationManager - $HADOOP_MAPRED_HOME is not set
Note: /tmp/sqoop-XXX/compile/XXXX/table_sample.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
[INFO ]: org.apache.sqoop.orm.CompilationManager - Writing jar file: /tmp/sqoop-XXXXXXX/compile/XXXXXXX/table_sample.jar
[WARN ]: org.apache.sqoop.manager.MySQLManager - It looks like you are importing from mysql.
[WARN ]: org.apache.sqoop.manager.MySQLManager - This transfer can be faster! Use the --direct
[WARN ]: org.apache.sqoop.manager.MySQLManager - option to exercise a MySQL-specific fast path.
[INFO ]: org.apache.sqoop.manager.MySQLManager - Setting zero DATETIME behavior to convertToNull (mysql)
[WARN ]: org.apache.sqoop.manager.CatalogQueryManager - The table table_sample contains a multi-column primary key. Sqoop will default to the column x01_navi_id only for this job.
[WARN ]: org.apache.sqoop.manager.CatalogQueryManager - The table table_sample contains a multi-column primary key. Sqoop will default to the column column_a only for this job.
[INFO ]: org.apache.sqoop.mapreduce.ImportJobBase - Beginning import of point
[INFO ]: org.apache.hadoop.conf.Configuration.deprecation - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
[INFO ]: org.apache.hadoop.conf.Configuration.deprecation - mapred.jar is deprecated. Instead, use mapreduce.job.jar
[INFO ]: org.apache.hadoop.conf.Configuration.deprecation - mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
[WARN ]: org.apache.sqoop.mapreduce.JobBase - SQOOP_HOME is unset. May not be able to find all job dependencies.
[INFO ]: org.apache.hadoop.conf.Configuration.deprecation - session.id is deprecated. Instead, use dfs.metrics.session-id
[INFO ]: org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=
[INFO ]: org.apache.hadoop.mapreduce.JobSubmitter - Cleaning up the staging area file:/tmp/hadoop-XXXX/mapred/staging/hdfs2104262709/.staging/job_local2104262709_0001
Exception in component tSqoopImport_4
java.lang.Exception: The Sqoop import job has failed. Please check the logs.
	at bigdatademo.step_2_sqoop_mysql_to_hdfs_import_simple_0_1.Step_2_SQOOP_MYSQL_TO_HDFS_Import_simple.tSqoopImport_4Process(Step_2_SQOOP_MYSQL_TO_HDFS_Import_simple.java:526)
	at bigdatademo.step_2_sqoop_mysql_to_hdfs_import_simple_0_1.Step_2_SQOOP_MYSQL_TO_HDFS_Import_simple$2.run(Step_2_SQOOP_MYSQL_TO_HDFS_Import_simple.java:912)
[ERROR]: org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.FileNotFoundException: File does not exist: hdfs://namenode_ip:8020/Applications/TOS_BD-20141207_1530-V5.6.1/workspace/.Java/lib/mysql-connector-java.jar
	at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
	at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
	at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
	at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
	at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
	at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)

The error suggests the job is trying to read the JDBC driver (mysql-connector-java.jar) from HDFS at an inappropriate path: it uses my local machine's (Mac) path, /Applications/..., as an HDFS path. How can I fix this?
Please find the screenshot attached (sorry for the locale, I'm using the Japanese UI):
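
One possible workaround, sketched here with the path taken from the log above: make the MySQL JDBC driver available at the HDFS location the job is looking for (or place it in the Sqoop lib directory on the cluster), so the distributed-cache lookup no longer fails:

# Illustrative sketch only - recreate the expected path on HDFS and upload the driver there
hdfs dfs -mkdir -p /Applications/TOS_BD-20141207_1530-V5.6.1/workspace/.Java/lib
hdfs dfs -put mysql-connector-java.jar /Applications/TOS_BD-20141207_1530-V5.6.1/workspace/.Java/lib/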

Best,
Daisuke

mini_blob_20150709-1928.png

Last edited by DaisukeSaito (2015-07-09 18:28:55)

Offline

#8 2017-01-07 07:10:46

rishit
Member
7 posts

rishit said:

Re: sqoop import

Hi @deepatalend,

Was your issue solved? I am facing the same problem.
Could you please let me know what was done to fix it?

Thanks,
Rishit Shah

Offline

#9 2017-03-11 06:56:02

sandeep rathore
Guest

sandeep rathore said:

Re: sqoop import

Hi, I am using Talend Big Data version 6.1.2 and trying to import data from SQL Server to Hadoop using tSqoopImport.
I am getting output like this:
Note: \tmp\sqoop-307425\compile\282dc771b9ac8f2a1793016ce1cc564a\cities.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
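
Those Note: lines are informational messages from the Java compiler about deprecated API usage, not the failure itself; if the job actually fails, the real error should appear elsewhere in the log. For reference, a minimal Sqoop-style import from SQL Server, with purely illustrative host, database, credential and path names, would look roughly like this:

# Illustrative sketch only - substitute your own host, database, credentials and target directory
sqoop import \
  --connect "jdbc:sqlserver://sqlserver_host:1433;databaseName=mydb" \
  --username myuser --password mypass \
  --table cities \
  --target-dir /user/talend/cities \
  -m 1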
