This post shows how to load data from Google BigQuery into Google Cloud Bigtable using Google Cloud Dataflow.

Question

I need to populate data into Google Cloud Bigtable and the source of the data will be Google BigQuery.

As an exercise, I am able to read the data from BigQuery, and as a separate exercise I am able to write data into Bigtable as well.

Now I have to combine these 2 operations into one Google Cloud Dataflow job. Any example will be of great help.

Answer

You can just use the transforms as shown in those examples, adding whatever logic you need in between, for example:

Pipeline p = Pipeline.create(options);
p.apply(BigQueryIO.Read.from("some_table"))
 .apply(ParDo.of(new DoFn<TableRow, Row>() {
   @Override
   public void processElement(ProcessContext c) {
     // Convert each BigQuery TableRow into whatever row type
     // your Bigtable write expects (conversion logic is yours).
     Row output = somehowConvertYourDataToARow(c.element());
     c.output(output);
   }
 }))
 .apply(BigtableIO.Write.withTableId("some_other_table"));
p.run();
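The only custom logic is the conversion step in the middle. As a minimal sketch of what that mapping could look like, here is a plain-Java version using standard collections in place of the Beam `TableRow` and Bigtable mutation types; the class name `RowConversionSketch`, the method `convertToCells`, and the `cf` column-family prefix are all hypothetical, not part of the Dataflow API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class RowConversionSketch {
    // Hypothetical stand-in for somehowConvertYourDataToARow: takes a BigQuery
    // record (field name -> value), treats one field as the row key, and turns
    // the remaining fields into column-qualifier -> value cells under a single
    // assumed column family "cf".
    static Map<String, String> convertToCells(Map<String, Object> bqRecord, String keyField) {
        Map<String, String> cells = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : bqRecord.entrySet()) {
            // Skip the key field and null values; stringify everything else.
            if (!e.getKey().equals(keyField) && e.getValue() != null) {
                cells.put("cf:" + e.getKey(), e.getValue().toString());
            }
        }
        return cells;
    }

    public static void main(String[] args) {
        Map<String, Object> record = new LinkedHashMap<>();
        record.put("user_id", "u123");
        record.put("clicks", 42);
        Map<String, String> cells = convertToCells(record, "user_id");
        System.out.println(cells); // {cf:clicks=42}
    }
}
```

In the real pipeline this logic would live inside the `processElement` body above, reading fields from the `TableRow` and emitting the mutation type your Bigtable sink expects.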

