I have been using Spark 1.6.1 and am now evaluating the Spark 2.0 Preview, but I cannot find org.apache.spark.sql.Row.

I need this class because I am migrating my DataFrame code from 1.6.1 to the 2.0 preview. Am I missing something here? My Maven dependencies are pasted below:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.0-preview</version>
        <scope>system</scope>
        <systemPath>C://spark-2.0.0-preview-bin-hadoop2.7//jars//spark-core_2.11-2.0.0-preview.jar</systemPath>
    </dependency>
    <dependency>
        <groupId>com.oracle</groupId>
        <artifactId>ojdbc7</artifactId>
        <version>12.1.0.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.0-preview</version>
        <scope>system</scope>
        <systemPath>C://spark-2.0.0-preview-bin-hadoop2.7//jars//spark-sql_2.11-2.0.0-preview.jar</systemPath>
    </dependency>

Best Answer

In Spark v2.0.0, the Row class has been moved to a different jar file.
Add this to your Maven dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-catalyst_2.11</artifactId>
    <version>2.0.0-preview</version>
    <scope>system</scope>
    <systemPath>C://spark-2.0.0-preview-bin-hadoop2.7//spark-catalyst_2.11-2.0.0-preview.jar</systemPath>
</dependency>
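With the catalyst jar on the classpath, the import should resolve again. As a minimal sketch (not from the original post), constructing a Row directly is enough to confirm the class is found at compile time:

```scala
import org.apache.spark.sql.Row

object RowCheck {
  def main(args: Array[String]): Unit = {
    // Build a Row by hand; no SparkSession is needed just to use the class.
    val row: Row = Row(1, "Alice", 3.14)

    // Positional accessors, as in Spark 1.6.x DataFrame code.
    println(row.getInt(0))
    println(row.getString(1))
  }
}
```

If this compiles, the migrated 1.6.1 DataFrame code that pattern-matches or reads fields from Row should build against the 2.0.0-preview jars as well.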
