Posted by Sar*_*mro

Sequential execution in Apache Beam - Java SDK 2.18.0

Hi, I have a couple of queries that I want to run sequentially with Apache Beam, saving the results of each one. I've seen some similar questions but couldn't find an answer. I'm used to designing pipelines with Airflow, and I'm still fairly new to Apache Beam. I'm using the Dataflow runner. Here is my current code: I want query2 to run only after the query1 results have been saved to the corresponding table. How can I chain them?

    PCollection<TableRow> resultsStep1 = getData("Run Query 1",
            "Select * FROM basetable");

    resultsStep1.apply("Save Query1 data",
            BigQueryIO.writeTableRows()
                    .withSchema(BigQueryUtils.toTableSchema(resultsStep1.getSchema()))
                    .to("resultsStep1")
                    .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
                    .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
    );

    PCollection<TableRow> resultsStep2 = getData("Run Query 2",
            "Select * FROM resultsStep1");

    resultsStep2.apply("Save Query2 data",
            BigQueryIO.writeTableRows()
                    .withSchema(BigQueryUtils.toTableSchema(resultsStep2.getSchema()))
                    .to("resultsStep2")
                    .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
                    .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
    );

Here is my getData function definition:

private PCollection<TableRow> getData(final String taskName, final String query) {
    return pipeline.apply(taskName,
            BigQueryIO.readTableRowsWithSchema()
                    .fromQuery(query)
                    .usingStandardSql()
                    .withCoder(TableRowJsonCoder.of()));
}

Edit (update): Result: You can’t sequence the completion of a …
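Since the quoted answer is cut off above, the following is only a hedged sketch of one way to guarantee the ordering, not necessarily what that answer proposed: build and run the first pipeline, block on waitUntilFinish(), and only then build and run the second pipeline, so the resultsStep1 table exists before query 2 is issued. It reuses the question's own transforms and placeholder table names (basetable, resultsStep1, resultsStep2); the class name SequentialQueries and the use of PipelineOptionsFactory are assumptions made for the example. Beam also has a Wait.on transform for ordering branches inside a single pipeline, but whether BigQueryIO.writeTableRows() exposes a usable completion signal in SDK 2.18.0 is not settled by the truncated text.

    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryUtils;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class SequentialQueries {  // hypothetical class name for this sketch
        public static void main(String[] args) {
            PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();

            // Pipeline 1: run query 1 and save it. waitUntilFinish() blocks until
            // the whole job, including the BigQuery write, has completed.
            Pipeline p1 = Pipeline.create(options);
            PCollection<TableRow> resultsStep1 = p1.apply("Run Query 1",
                    BigQueryIO.readTableRowsWithSchema()
                            .fromQuery("Select * FROM basetable")
                            .usingStandardSql());
            resultsStep1.apply("Save Query1 data",
                    BigQueryIO.writeTableRows()
                            .withSchema(BigQueryUtils.toTableSchema(resultsStep1.getSchema()))
                            .to("resultsStep1")
                            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
                            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE));
            p1.run().waitUntilFinish();

            // Pipeline 2: built and run only after pipeline 1 has finished, so the
            // resultsStep1 table already exists when this query is issued.
            Pipeline p2 = Pipeline.create(options);
            PCollection<TableRow> resultsStep2 = p2.apply("Run Query 2",
                    BigQueryIO.readTableRowsWithSchema()
                            .fromQuery("Select * FROM resultsStep1")
                            .usingStandardSql());
            resultsStep2.apply("Save Query2 data",
                    BigQueryIO.writeTableRows()
                            .withSchema(BigQueryUtils.toTableSchema(resultsStep2.getSchema()))
                            .to("resultsStep2")
                            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
                            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE));
            p2.run().waitUntilFinish();
        }
    }

The trade-off is that the two queries become two separate Dataflow jobs, which is closer to how the same dependency would be expressed as two tasks in Airflow.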

google-cloud-dataflow apache-beam apache-beam-io

1 vote · 1 answer · 753 views