
Using Sqoop to import data from MySQL to Hadoop fails

I tried to import data through Sqoop using the following command.

sqoop import --connect jdbc:mysql://localhost/test_sqoop --username root --table test

but I got a connection refused error.

I then found that I couldn't connect to MySQL at all and got this error:

Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock'

I also noticed that as long as I don't run start-dfs.sh, mysql.sock exists at /var/lib/mysql/mysql.sock.


After I run start-dfs.sh, mysql.sock is gone and I can no longer connect to MySQL.


Below is my /etc/my.cnf configuration.

datadir=/var/lib/mysql
socket=/var/lib/mysql/mysql.sock
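
One way to check whether mysqld is still running after start-dfs.sh, and whether it is still reachable over that socket, is sketched below (the service name and socket path vary by distribution; mysqld is assumed here):

# Check whether the MySQL server process is still alive (the service may be named 'mysql' on some distros)
sudo systemctl status mysqld
# Check whether the socket file configured in /etc/my.cnf still exists
ls -l /var/lib/mysql/mysql.sock
# Ask the server for its status over that socket (prompts for the root password)
mysqladmin --socket=/var/lib/mysql/mysql.sock -u root -p status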

  • The JDBC string should be jdbc:mysql://localhost:3306/test_sqoop. Best practice is to use the server name instead of localhost or 127.0.0.1; you can get it from the command hostname -f. So the JDBC string should be jdbc:mysql://servername:3306/test_sqoop, with servername replaced by the output of hostname -f. A corrected command is sketched after this list.
  • You need -P, --password, or --connection-param-file to pass the password to the sqoop command; Sqoop does not read from the .my.cnf file (see the Sqoop usage documentation).
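
Putting both points together, the import command would look roughly like this (a sketch; servername stands for the output of hostname -f, and -P makes Sqoop prompt for the password interactively):

sqoop import --connect jdbc:mysql://servername:3306/test_sqoop --username root -P --table test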
