"No root units" error in Dataflow, from PubSub to BigQuery in Golang

Ege*_*rar 3 go google-bigquery google-cloud-pubsub google-cloud-dataflow

I am trying to read messages from Pub/Sub and then write them to a BigQuery table in Dataflow. However, I run into a "no root units" error when using the direct runner.

Here is my code:

package main

import (
    "context"
    "encoding/json"
    "flag"
    "fmt"

    "github.com/apache/beam/sdks/v2/go/pkg/beam/io/bigqueryio"
    "github.com/apache/beam/sdks/v2/go/pkg/beam/x/debug"

    "github.com/apache/beam/sdks/v2/go/pkg/beam"
    "github.com/apache/beam/sdks/v2/go/pkg/beam/io/pubsubio"
    "github.com/apache/beam/sdks/v2/go/pkg/beam/log"
    "github.com/apache/beam/sdks/v2/go/pkg/beam/x/beamx"
)


type DummyBody struct {
    TaskId string `json:"id" bigquery:"id"`
}


func buildPipeline(s beam.Scope) {
    rawDummyBodies := pubsubio.Read(s, "project", "topic", &pubsubio.ReadOptions{Subscription: "sub.ID"})

    dummyBodies := beam.ParDo(s, func(ctx context.Context, data []byte) (DummyBody, error) {
        var body DummyBody
        if err := json.Unmarshal(data, &body); err != nil {
            log.Error(ctx, err)
            fmt.Println("Error")
            return body, err
        }
        fmt.Println("No Error")
        return body, nil
    }, rawDummyBodies)

    debug.Printf(s, "Task : %#v", dummyBodies)

    bigqueryio.Write(s, "project", "table", dummyBodies)
}

func main() {
    flag.Parse()
    beam.Init()

    p, s := beam.NewPipelineWithRoot()
    buildPipeline(s)

    ctx := context.Background()
    if err := beamx.Run(ctx, p); err != nil {
        log.Exitf(ctx, "Failed to execute pipeline: %v", err)
    }
}

The pipeline starts executing on the direct runner but fails because there are no root units:

2022/11/01 14:29:55 Failed to execute pipeline: translation failed due to: no root units
exit status 1

小智 6

The current implementation of pubsubio in the Go SDK only works with the Dataflow runner. The direct runner cannot translate the unbounded Pub/Sub source, so the pipeline ends up with no root transforms, which is exactly the "no root units" error you are seeing.
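
To verify, you can launch the same binary on the Dataflow runner by passing the standard Beam Go pipeline flags. This is a sketch; the project ID, region, and `gs://` bucket paths below are placeholders for your own setup:

```shell
# Run the pipeline on Dataflow instead of the direct runner.
# --project, --region, and the gs:// locations are placeholders.
go run main.go \
    --runner=dataflow \
    --project=my-gcp-project \
    --region=us-central1 \
    --staging_location=gs://my-bucket/staging \
    --temp_location=gs://my-bucket/tmp
```

With `--runner=dataflow`, the Pub/Sub read is translated into a native Dataflow source, so the "no root units" error does not occur. For purely local testing, one common workaround is to substitute a bounded source such as `beam.CreateList` for the `pubsubio.Read` step.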