How to pass the date dynamically to a shell script for a sqoop command?

Problem Description

I'm working on a sqoop import with the following command:

#!/bin/bash
while IFS=":" read -r server dbname table; do
  sqoop eval --connect jdbc:mysql://$server/$dbname --username root --password cloudera --table mydata --hive-import --hive-table dynpart --check-column id --last-value $(hive -e "select max(id) from dynpart") --hive-partition-key 'thisday' --hive-partition-value '01-01-2016'
done < tables.txt

I'm creating a partition for each day. The Hive table:

create table dynpart(id int, name char(30), city char(30))
  partitioned by(thisday char(10))
  row format delimited
  fields terminated by ','
  stored as textfile
  location '/hive/mytables'
  tblproperties("comment"="partition column: thisday structure is dd-mm-yyyy");

But I don't want to hard-code the partition value, because I want to create a sqoop job and run it every day. In the script, how can I pass the date value to the sqoop command dynamically (format: dd/mm/yyyy) instead of giving it directly? Any help is appreciated.

Recommended Answer

You can try the following code:

#!/bin/bash
DATE=$(date +"%d-%m-%Y")
while IFS=":" read -r server dbname table; do
  sqoop eval --connect jdbc:mysql://$server/$dbname --username root --password cloudera --table mydata --hive-import --hive-table dynpart --check-column id --last-value $(hive -e "select max(id) from dynpart") --hive-partition-key 'thisday' --hive-partition-value "$DATE"
done < tables.txt
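One detail worth double-checking is the format string passed to date: %y produces a two-digit year, while the table comment says the thisday column uses dd-mm-yyyy, which requires the capital %Y. A minimal, hypothetical sketch that computes the partition value once and validates its shape before any import runs (the Hive query is shown commented out, since it needs a live Hive installation):

```shell
#!/bin/bash
# Compute today's partition value in dd-mm-yyyy form (%Y gives a 4-digit year).
thisday=$(date +"%d-%m-%Y")

# Fail fast if the value does not match the dd-mm-yyyy layout the table expects.
if [[ ! $thisday =~ ^[0-9]{2}-[0-9]{2}-[0-9]{4}$ ]]; then
  echo "unexpected date format: $thisday" >&2
  exit 1
fi
echo "partition value: $thisday"

# The incremental boundary would normally come from Hive; -S (silent mode)
# keeps log noise out of the captured value. Commented out because it
# requires a working Hive installation:
# last_value=$(hive -S -e "select max(id) from dynpart")
```

Capturing the date in a variable before the loop also means every table read from tables.txt on a given day lands in the same partition.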

Hope this helps.

