
gRPC error with Docker on Mac - Kafka stream processing with Python, Beam, and Flink


Update: I spun up an EC2 instance and was able to get the example below working there, which confirms this is a connectivity issue with Docker on Mac.

Update: Even after shutting down the Flink server container and Kafka, I still hit this error, which leads me to believe it is a connectivity issue.

I recently tried to process a Kafka stream with Python, Apache Beam, and Apache Flink, following this tutorial. Per the tutorial, I set up Flink with the following command:

docker run --net=host apache/beam_flink1.13_job_server:latest

Doing so produces the following output:

Jul 14, 2021 8:40:47 PM org.apache.beam.runners.jobsubmission.JobServerDriver createArtifactStagingService
INFO: ArtifactStagingService started on localhost:8098
Jul 14, 2021 8:40:47 PM org.apache.beam.runners.jobsubmission.JobServerDriver createExpansionService
INFO: Java ExpansionService started on localhost:8097
Jul 14, 2021 8:40:47 PM org.apache.beam.runners.jobsubmission.JobServerDriver createJobServer
INFO: JobService started on localhost:8099
Jul 14, 2021 8:40:47 PM org.apache.beam.runners.jobsubmission.JobServerDriver run
INFO: Job server now running, terminate with Ctrl+C

When I run my script with python main.py (shown below), I get the following error:

grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
        status = StatusCode.UNAVAILABLE
        details = "Failed to connect to all addresses"
        debug_error_string = "{"created":"@1626301362.091496000","description":"Failed to pick subchannel","file":"src/core/ext/filters/client_channel/client_channel.cc","file_line":3009,"referenced_errors":[{"created":"@1626301362.091494000","description":"Failed to connect to all addresses","file":"src/core/ext/filters/client_channel/lb_policy/pick_first/pick_first.cc","file_line":398,"grpc_status":14}]}"

Does anyone know a quick fix? I should note that I found this.

main.py

import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions

if __name__ == '__main__':
    options = PipelineOptions([
        "--runner=PortableRunner","--job_endpoint=localhost:8099","--environment_type=LOOPBACK",])

    pipeline = beam.Pipeline(options=options)

    result = (
        pipeline
        | "Read from kafka" >> ReadFromKafka(
            consumer_config={
                "bootstrap.servers": 'localhost:9092',},topics=['demo'],expansion_service='localhost:8097',)

        | beam.Map(print)
    )

    pipeline.run()

Solution

--net=host is not supported by Docker Desktop for Mac, so the job server's ports are never actually reachable from the host.
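
A possible workaround (a sketch, not part of the original answer): instead of --net=host, publish the job server's ports explicitly, assuming the three ports reported in the log above (8097-8099) are the only ones the pipeline needs to reach:

docker run -p 8099:8099 -p 8098:8098 -p 8097:8097 apache/beam_flink1.13_job_server:latest

With the ports published this way, the localhost:8099 job endpoint and localhost:8097 expansion service referenced in main.py resolve from the host as before. The Kafka bootstrap server at localhost:9092 and the LOOPBACK environment may still need their own network configuration on Docker for Mac; this only covers the job server connectivity.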
