Import all tables:

sqoop import-all-tables --connect jdbc:mysql://ip:port/dbName --username userName --password passWord -m 1 --hive-import
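
For example, with a hypothetical local MySQL instance (localhost:3306, database testdb) and made-up credentials, the command looks like the sketch below. Note that with more than one mapper, import-all-tables requires every table to have a primary key, so -m 1 is the safe default:

sqoop import-all-tables --connect jdbc:mysql://localhost:3306/testdb --username root --password secret -m 1 --hive-import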

Import a single table:

sqoop import --connect jdbc:mysql://ip:port/dbName --table tableName --username userName --password passWord -m 1 --hive-import
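
If the Hive table should get a different name from the source table, --hive-table sets it explicitly (hiveTbName below is a hypothetical target name):

sqoop import --connect jdbc:mysql://ip:port/dbName --table tableName --username userName --password passWord -m 1 --hive-import --hive-table hiveTbName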

In Hive, create a table tb1 modeled on table tb1 of the MySQL database sqoop:

sqoop create-hive-table --connect jdbc:mysql://ip:port/dbName --table tableName --fields-terminated-by ','  --username userName --password passWord
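
To verify that the table was created with the expected columns and the ',' field delimiter, inspect it from the Hive CLI (a quick check, assuming hive is on the PATH):

hive -e "DESCRIBE FORMATTED tableName;"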

Import the data of table tb1 in the sqoop database into HDFS, with the number of map tasks set to 1:

sqoop import --connect jdbc:mysql://ip:port/dbName --username userName --password passWord   --table tableName -m 1
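
With no --target-dir given, Sqoop writes under the importing user's HDFS home directory, in a subdirectory named after the table; here that would be /user/code-pc/tb1, matching the path loaded into Hive below. The output files can be listed to confirm:

hdfs dfs -ls /user/code-pc/tb1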

Load the data from HDFS into the Hive table tb1:

load data inpath '/user/code-pc/tb1/part-m-00000' into table tb1;
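
Note that LOAD DATA INPATH moves the file into the table's warehouse directory rather than copying it, so /user/code-pc/tb1/ will no longer contain part-m-00000 afterwards. A quick row-count sanity check from the Hive shell:

select count(*) from tb1;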

Segmented parallel import:

sqoop import --append --connect jdbc:mysql://ip:port/dbName --username userName --password passWord --target-dir '/user/pg' --table tableName -m 1 --where "guidestep<='3'"
sqoop import --append --connect jdbc:mysql://ip:port/dbName --username userName --password passWord --target-dir '/user/pg' --table tableName -m 1 --where "guidestep>'3' and guidestep<='5'"
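
Because --append adds new files to an existing --target-dir instead of failing when the directory is non-empty, each segment lands in '/user/pg' alongside the others. The remaining range can be imported the same way (a sketch covering rows above '5', assuming guidestep values extend beyond that bound):

sqoop import --append --connect jdbc:mysql://ip:port/dbName --username userName --password passWord --target-dir '/user/pg' --table tableName -m 1 --where "guidestep>'5'"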

Parallel import:

sqoop import  --connect jdbc:mysql://ip:port/dbName --username userName --password passWord  --target-dir '/user/pg1/'  --split-by columnName   --table tableName -m 10
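
Here --split-by tells Sqoop which column to partition on: it first issues a bounding-values query of roughly the shape below, then divides the [min, max] range into 10 intervals, one per mapper, so columnName should be a uniformly distributed numeric column (typically the primary key):

SELECT MIN(columnName), MAX(columnName) FROM tableName;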