In Hive, the arguments passed to concat_ws() must be string or array<string>; anything else fails with an error like the following:
hive> select concat_ws(',',unix_timestamp('2012-12-07 13:01:03'),unix_timestamp('2012-12-07 15:01:03'));
FAILED: SemanticException [Error 10016]: Line 1:21 Argument type mismatch ''2012-12-07 13:01:03'':
Argument 2 of function CONCAT_WS must be "string or array<string>", but "bigint" was found.
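The usual workaround in Hive is to cast the bigint returned by unix_timestamp() to string explicitly before concatenating (a sketch of the same query as above, with casts added):

```
hive> select concat_ws(',',
        cast(unix_timestamp('2012-12-07 13:01:03') as string),
        cast(unix_timestamp('2012-12-07 15:01:03') as string));
```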


In Spark SQL, however, the arguments to concat_ws() are not required to be strings; numeric types such as int or bigint are accepted and implicitly cast to string.

(unix_timestamp() returns a bigint.)


spark-sql> select concat_ws(',',unix_timestamp('2012-12-07 13:01:03'),unix_timestamp('2012-12-07 15:01:03'));

Output: 1354856463,1354863663
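The Spark SQL behavior above can be mimicked in plain Python — a minimal sketch (the function name concat_ws here is just an illustration, not a Spark API): each argument is cast to string before joining, and NULL (None) arguments are skipped, which is what Spark's concat_ws does with null inputs.

```python
def concat_ws(sep, *args):
    # Mimic Spark SQL's implicit coercion: cast each non-null
    # argument to string, skip nulls, then join with the separator.
    return sep.join(str(a) for a in args if a is not None)

# bigint-like inputs are accepted, matching the spark-sql example above
print(concat_ws(",", 1354856463, 1354863663))  # -> 1354856463,1354863663
```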

