How to fix an Airflow instance that cannot reach an edge node server through SFTPSensor (SSH connection type)
My goal is to have an Airflow DAG check whether a file exists in a directory on a different server (in this case, an edge node of a cluster).
My first approach was an SSHOperator that triggered a bash script (on the edge node server) to check whether the directory was empty. This worked: the DAG logs showed the script's output telling me whether the directory was empty. However, whenever the SSHOperator failed (i.e. the script found no files in the directory), the current DAG run got broken off and a new DAG run started. Since this is expected to happen many times, I ended up with a bunch of broken DAG runs in the tree view =/
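For reference, that first approach looked roughly like the sketch below (the connection id and the script path here are placeholders, not my real ones):

from airflow import DAG
from airflow.contrib.operators.ssh_operator import SSHOperator
from airflow.utils.dates import days_ago

with DAG(dag_id='check_edge_node_dir_test',
         start_date=days_ago(1),
         schedule_interval=None) as dag:
    check_dir = SSHOperator(
        task_id='check_edge_node_dir',
        ssh_conn_id='ssh_connection_id_imb',    # placeholder connection id
        # the trailing space stops Airflow from treating the ".sh" suffix
        # as a Jinja template file reference
        command='bash /path/to/check_dir.sh ',  # placeholder script path
    )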
So my second approach was to use a proper sensor, and SFTPSensor looked like the best fit for this.
Here is my Python DAG code:
from airflow import DAG
from datetime import timedelta, datetime
from airflow.utils.dates import days_ago
from airflow.models import Variable
import requests
import logging
import time
from airflow.contrib.sensors.sftp_sensor import SFTPSensor
from airflow.operators.python_operator import PythonOperator


def say_bye(**context):
    print("byebyeeee!")


default_args = {
    'owner': 'airflow',
    "start_date": days_ago(1),
}

ssh_id = Variable.get("ssh_connection_id_imb")
source_path = "/trf/cq/millennium/rcp/"

dag = DAG(
    dag_id='ing_cgd_millennium_t_ukajrnl_imb_test4',
    default_args=default_args,
    schedule_interval=None,
)

with dag:
    s0 = SFTPSensor(
        task_id='sensing_task',
        path=source_path,
        fs_conn_id=ssh_id,
        poke_interval=60,
        mode='reschedule',
        retries=1,
    )

    t1 = PythonOperator(
        task_id='run_this_goodbye',
        python_callable=say_bye,
        provide_context=True,
    )

    s0 >> t1
My SSH connection (ssh_connection_id_imb) looks like this: https://i.stack.imgur.com/x7iLu.png
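(In case the screenshot doesn't load: a connection with the same id could also be created programmatically, roughly as below; the host is taken from the sensor log further down, and the login and port are assumptions on my part.)

from airflow import settings
from airflow.models import Connection

conn = Connection(
    conn_id='ssh_connection_id_imb',
    conn_type='ssh',
    host='lpc600.group.com',  # edge node host, as seen in the poking log below
    login='airflow',          # assumed login
    port=22,                  # assumed port
)
session = settings.Session()
session.add(conn)
session.commit()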
And the error:
[2021-03-09 11:56:07,662] {base_hook.py:89} INFO - Using connection to: id: sftp_default. Host: localhost, Port: 22, Schema: None, Login: airflow, Password: None, extra: XXXXXXXX
[2021-03-09 11:56:07,664] {base_hook.py:89} INFO - Using connection to: id: sftp_default. Host: localhost, Port: 22, Schema: None, Login: airflow, Password: None, extra: XXXXXXXX
[2021-03-09 11:56:07,665] {sftp_sensor.py:46} INFO - Poking for lpc600.group.com:/trf/cq/millenium/rcp/C.PGMLNGL.FKM001.041212.20201123.gz
[2021-03-09 11:56:07,665] {logging_mixin.py:112} WARNING - /opt/miniconda/lib/python3.7/site-packages/pysftp/__init__.py:61: UserWarning: Failed to load HostKeys from /root/.ssh/known_hosts.  You will need to explicitly load HostKeys (cnopts.hostkeys.load(filename)) or disableHostKey checking (cnopts.hostkeys = None).
  warnings.warn(wmsg, UserWarning)
[2021-03-09 11:56:07,666] {taskinstance.py:1150} ERROR - Unable to connect to localhost: [Errno 101] Network is unreachable
Traceback (most recent call last):
  File "/opt/miniconda/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/opt/miniconda/lib/python3.7/site-packages/airflow/sensors/base_sensor_operator.py", line 107, in execute
    while not self.poke(context):
  File "/opt/miniconda/lib/python3.7/site-packages/airflow/contrib/sensors/sftp_sensor.py", line 48, in poke
    self.hook.get_mod_time(self.path)
  File "/opt/miniconda/lib/python3.7/site-packages/airflow/contrib/hooks/sftp_hook.py", line 219, in get_mod_time
    conn = self.get_conn()
  File "/opt/miniconda/lib/python3.7/site-packages/airflow/contrib/hooks/sftp_hook.py", line 114, in get_conn
    self.conn = pysftp.Connection(**conn_params)
  File "/opt/miniconda/lib/python3.7/site-packages/pysftp/__init__.py", line 140, in __init__
    self._start_transport(host, port)
  File "/opt/miniconda/lib/python3.7/site-packages/pysftp/__init__.py", line 176, in _start_transport
    self._transport = paramiko.Transport((host, port))
  File "/opt/miniconda/lib/python3.7/site-packages/paramiko/transport.py", line 416, in __init__
    "Unable to connect to {}: {}".format(hostname, reason)
paramiko.ssh_exception.SSHException: Unable to connect to localhost: [Errno 101] Network is unreachable
I noticed that the base_hook points to localhost while the sftp_sensor points to the correct server... Do I need to set up the base hook somewhere? Am I missing a step? Thanks for your help! =)
Solution
Just realized my mistakes...
Problem #1: wrong connection parameter name. fs_conn_id is not an SFTPSensor argument, so the sensor fell back to the default sftp_default connection (localhost), which is exactly what the base_hook log above shows:
s0 = SFTPSensor(
    task_id='sensing_task',
    path=source_path,
    sftp_conn_id=ssh_id,  # instead of fs_conn_id
    poke_interval=60,
    mode='reschedule',
    retries=1,
)
Problem #2: the Extra field needs to be defined on the connection.
I created a public key and added it to the Extra field:
{"key_file": "/airflow/generated_sshkey_dir/id_rsa.pub", "no_host_key_check": true}
Sooo, this makes my connection vulnerable to man-in-the-middle attacks, since I'm not checking the host key. In my case, this solution was good enough.
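If you'd rather keep host key checking on, one alternative (a sketch of an option I did not take) is to record the edge node's host key in known_hosts once and then drop "no_host_key_check" from the Extra field; the hostname and the known_hosts path below are taken from the warning in the log above:

import subprocess

# One-time step: append the edge node's host key to known_hosts so pysftp
# can verify it on later connects.
keys = subprocess.run(
    ['ssh-keyscan', '-H', 'lpc600.group.com'],
    capture_output=True, text=True, check=True,
).stdout
with open('/root/.ssh/known_hosts', 'a') as f:
    f.write(keys)

After that, the Extra field would only need the "key_file" entry.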