
I want to know why the tables I imported from SQL Server into a Hive database using Sqoop are disappearing.

So I'm trying to import-all-tables into the Hive DB, i.e. /user/hive/warehouse/... on HDFS, using the command below:

sqoop import-all-tables --connect "jdbc:sqlserver://<servername>;database=<dbname>" \
--username "<username>" \
--password "<password>" \
--warehouse-dir "/user/hive/warehouse/" \
--hive-import \
-m 1

The test database has 3 tables. When the MapReduce job runs it reports success, i.e. the job is 100% complete, but the tables are not found in the Hive DB.

Each table's output is basically getting overwritten by the last one. Try removing the forward slash at the end of the directory path. For testing, I would suggest not using the warehouse directory; use something like '/tmp/sqoop/allTables' instead.
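Following that suggestion, the adjusted command might look like this (a sketch only: the staging path '/tmp/sqoop/allTables' is just an example, and the connection placeholders are kept from the question):

```shell
# Stage the imported files under a temp directory instead of the Hive
# warehouse, with no trailing slash, so each table keeps its own
# subdirectory before --hive-import loads it into Hive.
sqoop import-all-tables \
  --connect "jdbc:sqlserver://<servername>;database=<dbname>" \
  --username "<username>" \
  --password "<password>" \
  --warehouse-dir "/tmp/sqoop/allTables" \
  --hive-import \
  -m 1
```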

There is another way:
1. Create a Hive database pointing to a location, say "targetLocation".
2. Create an HCatalog table in your Sqoop import using the previously created database.
3. Use the target-directory import option to point to that targetLocation.
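A sketch of those steps (the database name sqoop_test and the location /user/hive/targetLocation are made-up examples; --hcatalog-database, --hcatalog-table, and --create-hcatalog-table are Sqoop's HCatalog integration options):

```shell
# 1. Create a Hive database that points at an explicit HDFS location.
hive -e "CREATE DATABASE sqoop_test LOCATION '/user/hive/targetLocation'"

# 2./3. Let Sqoop create and load an HCatalog table inside that database,
#       so the data lands under the database's location.
sqoop import \
  --connect "jdbc:sqlserver://<servername>;database=<dbname>" \
  --username "<username>" --password "<password>" \
  --table <tablename> \
  --hcatalog-database sqoop_test \
  --hcatalog-table <tablename> \
  --create-hcatalog-table \
  -m 1
```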

You don't need to define the warehouse directory. Just define the Hive database and Sqoop will automatically figure out the working directory:

sqoop import-all-tables --connect "jdbc:sqlserver://xxx.xxx.x.xxx:xxxx;databaseName=master" --username xxxxxx --password xxxxxxx --hive-import --create-hive-table  --hive-database test -m 1

It will just run like a rocket.

Hope it works for you.

