I recently upgraded an existing pipeline from Dataflow 1.x to Dataflow 2.x, and I'm seeing an error that doesn't make sense to me. I'll include the relevant code below, followed by the error I'm seeing.
// This is essentially the final step in our pipeline, where we write
// one of the side outputs from the pipeline to a BigQuery table
results.get(matchedTag)
    .apply("CountBackfill", Count.<String>perElement())
    .apply("ToReportRow", ParDo.of(new ToReportRow()))
    // at this point, there is now a PCollection<TableRow>
    .apply("WriteReport", BigQueryIO.writeTableRows()
        .to(reportingDataset + ".AttributeBackfill_" + dayStr)
        .withSchema(ReportSchema.get())
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));
/*
 * Create a TableRow from a key/value pair
 */
public static class ToReportRow extends DoFn<KV<String, Long>, TableRow> {
    private static final long serialVersionUID = 1L;

    @ProcessElement
    public void processElement(ProcessContext c) throws InterruptedException {
        KV<String, Long> row = c.element();
        c.output(new TableRow()
            .set(ReportSchema.ID, row.getKey())
            .set(ReportSchema.COUNT, row.getValue()));
    }
}
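For reference, the per-element transformation this DoFn performs can be sketched in plain Java. This is only an illustrative stand-in: a `Map` replaces `TableRow` (which needs the BigQuery client library), and the column names `"id"` and `"count"` are assumed placeholders for `ReportSchema.ID` and `ReportSchema.COUNT`:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ToReportRowSketch {
    // Stand-ins for ReportSchema.ID / ReportSchema.COUNT (assumed names).
    static final String ID = "id";
    static final String COUNT = "count";

    // Mirrors ToReportRow.processElement: one KV<String, Long> in, one row out.
    static Map<String, Object> toRow(String key, long count) {
        Map<String, Object> row = new LinkedHashMap<>();
        row.put(ID, key);
        row.put(COUNT, count);
        return row;
    }

    public static void main(String[] args) {
        System.out.println(toRow("attributeA", 42L));
    }
}
```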
Here is the error I'm seeing (a `NoSuchMethodError`): the line

.apply("WriteReport", BigQueryIO.writeTableRows()

is line 106 of DUP.java, so I suspect that line is the culprit. Any ideas what the problem might be?
Best answer
The solution to this problem turned out to be in the Maven dependencies. After adding the following dependency and recompiling with mvn, the error went away.
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>22.0</version>
</dependency>
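A `NoSuchMethodError` at runtime usually means the code was compiled against one version of a library (here Guava, which Beam's BigQuery I/O depends on) but a different version ended up on the runtime classpath; `mvn dependency:tree -Dincludes=com.google.guava` will show which Guava versions the build is pulling in. As a rough diagnostic sketch (the class names below are only examples, not part of the original question), you can also print which jar a given class was actually loaded from:

```java
// Diagnostic sketch: report where a class was loaded from, to help spot
// classpath conflicts. In a real pipeline you would pass a Guava class
// such as com.google.common.collect.ImmutableList.
public class WhichJar {
    public static void main(String[] args) {
        printOrigin(java.util.regex.Pattern.class); // JDK class, as an example
    }

    static void printOrigin(Class<?> c) {
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        // Classes loaded by the bootstrap loader (the JDK itself) have no CodeSource.
        System.out.println(c.getName() + " loaded from: "
                + (src == null ? "<bootstrap classpath>" : src.getLocation()));
    }
}
```

If the printed location is an unexpected jar (e.g. an older Guava bundled by another dependency), that conflict is the likely source of the `NoSuchMethodError`.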
On "java - Apache Beam, NoSuchMethodError on BigQueryIO.WriteTableRows()?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/48067265/