I am working with a complex, nested JSON file; below is its schema.

root
 |-- businessEntity: array (nullable = true)
 |    |-- element: struct (containsNull = true)
 |    |    |-- payGroup: array (nullable = true)
 |    |    |    |-- element: struct (containsNull = true)
 |    |    |    |    |-- reportingPeriod: struct (nullable = true)
 |    |    |    |    |    |-- worker: array (nullable = true)
 |    |    |    |    |    |    |-- element: struct (containsNull = true)
 |    |    |    |    |    |    |    |-- category: string (nullable = true)
 |    |    |    |    |    |    |    |-- person: struct (nullable = true)
 |    |    |    |    |    |    |    |-- tax: array (nullable = true)
 |    |    |    |    |    |    |    |    |-- element: struct (containsNull = true)
 |    |    |    |    |    |    |    |    |    |-- code: string (nullable = true)
 |    |    |    |    |    |    |    |    |    |-- qtdAmount: double (nullable = true)
 |    |    |    |    |    |    |    |    |    |-- ytdAmount: double (nullable = true)

My requirement is to build a hash map whose key is the code concatenated with "qtdAmount" and whose value is the value of qtdAmount, i.e. Map.put(code + "qtdAmount", qtdAmount). How can I do this with Spark?
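For example (the codes and amounts below are made up, purely to illustrate the shape of the result), the map I want would look like:

Map("F01" + "qtdAmount" -> 123.45, "M02" + "qtdAmount" -> 678.90)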

I tried the following spark-shell commands.
import org.apache.spark.sql._
val sqlcontext = new SQLContext(sc)
val cdm = sqlcontext.read.json("/user/edureka/CDM/cdm.json")
val spark = SparkSession.builder().appName("SQL").config("spark.some.config.option","some-value").getOrCreate()
cdm.createOrReplaceTempView("CDM")
val sqlDF = spark.sql("SELECT businessEntity[0].payGroup[0] from CDM").show()
val address = spark.sql("SELECT businessEntity[0].payGroup[0].reportingPeriod.worker[0].person.address from CDM as address")
val worker = spark.sql("SELECT businessEntity[0].payGroup[0].reportingPeriod.worker from CDM")
val tax = spark.sql("SELECT businessEntity[0].payGroup[0].reportingPeriod.worker[0].tax from CDM")
val tax = sqlcontext.sql("SELECT businessEntity[0].payGroup[0].reportingPeriod.worker[0].tax from CDM")
tax.select("tax.code")


val codes = tax.select(expode(tax("code"))
scala> val codes = tax.withColumn("code",explode(tax("tax.code"))).withColumn("qtdAmount",explode(tax("tax.qtdAmount"))).withColumn("ytdAmount",explode(tax("tax.ytdAmount")))

I am trying to get all of the codes and qtdAmounts into a map, but I cannot work it out: using multiple explode statements on a single DataFrame produces a Cartesian product of the elements.

Can someone help with how to parse such a complex JSON in Spark?

Best answer

You can get code and qtdAmount this way (note that the element entries printed by printSchema are just placeholders for array elements, not field names, so they do not appear in the column path):

import sqlcontext.implicits._

     cdm.select(
        $"businessEntity.payGroup.reportingPeriod.worker.tax.code".as("code"),
        $"businessEntity.payGroup.reportingPeriod.worker.tax.qtdAmount".as("qtdAmount")
      ).show

For more details, check this.
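If the direct dotted path above does not resolve through the doubly nested arrays on your Spark version, a more defensive sketch is to explode one array level per select (which also avoids the Cartesian product you get from several explodes in one statement) and then collect the pairs into a Scala Map keyed as code + "qtdAmount". This is only a sketch against the schema above; it assumes import sqlcontext.implicits._ from earlier and drops rows where code or qtdAmount is null.

import org.apache.spark.sql.functions.explode

// Explode one array level per select, so each step is a simple row-per-element expansion.
val taxDF = cdm
  .select(explode($"businessEntity").as("be"))
  .select(explode($"be.payGroup").as("pg"))
  .select(explode($"pg.reportingPeriod.worker").as("w"))
  .select(explode($"w.tax").as("t"))
  .select($"t.code".as("code"), $"t.qtdAmount".as("qtdAmount"))

// Collect to the driver and build the requested map: Map(code + "qtdAmount" -> qtdAmount).
val qtdByCode: Map[String, Double] = taxDF
  .na.drop(Seq("code", "qtdAmount"))           // skip rows with a null code or amount
  .collect()
  .map(row => (row.getAs[String]("code") + "qtdAmount", row.getAs[Double]("qtdAmount")))
  .toMap

Note that if the same code occurs more than once, toMap keeps only the last pair, so aggregate (for example, group by code and sum) before collecting if that matters.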
