
How do I launch an AWS CloudFormation stack with package dependencies?


I'm trying to get this repo running: https://github.com/mydatastack/google-analytics-to-s3

A link is provided to launch the AWS CloudFormation stack, but it no longer works because the S3 bucket containing the template is no longer active.

So I tried to launch the stack myself via sam deploy --guided. This starts building the stack, but fails partway through with the following error:

C:\Users\Me\GAS3\cloudformation>sam deploy --guided

Configuring SAM deploy
======================

        Looking for config file [samconfig.toml] :  Found
        Reading default arguments  :  Success

        Setting default arguments for 'sam deploy'
        =========================================
        Stack Name [GA_2_S3]:
        AWS Region [eu-central-1]:
        Parameter Name [pipes]:
        Parameter Stage [local]:
        Parameter Adminemail [info@project.com]:
        Parameter FallbackEmail [info@project.com]:
        Parameter S3AlarmPeriod [60]:
        #Shows you resources changes to be deployed and require a 'Y' to initiate deploy
        Confirm changes before deploy [Y/n]: y
        #SAM needs permission to be able to create roles to connect to the resources in your template
        Allow SAM CLI IAM role creation [Y/n]: y
        Save arguments to configuration file [Y/n]: y
        SAM configuration file [samconfig.toml]:
        SAM configuration environment [default]:

        Looking for resources needed for deployment: Found!

                Managed S3 bucket: aws-sam-cli-managed-default-samclisourcebucket-1vcjy21utm1w6
                A different default S3 bucket can be set in samconfig.toml

        Saved arguments to config file
        Running 'sam deploy' for future deployments will use the parameters saved above.
        The above parameters can be changed by modifying samconfig.toml
        Learn more about samconfig.toml Syntax at
        https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-config.html


File with same data already exists at GA_2_S3/d5396e95465bde0f60dbd769db9fe763,skipping upload
File with same data already exists at GA_2_S3/df3bbd85d54385405a650fc656f1ac19,skipping upload
File with same data already exists at GA_2_S3/2c01865beec56ebee30ae5b24e6f50e3,skipping upload
File with same data already exists at GA_2_S3/4adb166d233b6e3a1badf491522b0bcc,skipping upload
Error: Unable to upload artifact ./collector-ga.yaml referenced by Location parameter of GoogleAnalyticsCollectorStack resource.
Unable to upload artifact ../functions/lambda-layers/paramiko/ referenced by ContentUri parameter of SFTPLayer resource.
Parameter ContentUri of resource SFTPLayer refers to a file or folder that does not exist C:\Users\Me\GAS3\functions\lambda-layers\paramiko

Checking the folders, there is no ./lambda-layers/ folder or paramiko package on GitHub. I have already tried downloading the paramiko package from GitHub and creating the referenced /functions/lambda-layers/paramiko/ folder myself, but that didn't work.

Looking at ./collector-ga.yaml, this is the part that fails:

  SFTPLayer:
    Condition: SFTPUploadActivate
    Type: AWS::Serverless::LayerVersion
    Properties:
      LayerName: paramiko
      Description: paramkio lib for sftp connect
      ContentUri: ../functions/lambda-layers/paramiko/
      CompatibleRuntimes:
        - python3.7
      LicenseInfo: MIT should be added here 
      RetentionPolicy: Retain

The ContentUri location given for paramiko is not in the GitHub repo, so it must have been built some other way, since the original repository was meant to launch a working stack at the click of a button.
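From the SAM documentation it looks like one way to build it would be to let sam build create the layer from a requirements.txt, but neither the build metadata nor a requirements file ships with the repo. A rough sketch of what I assume that would look like (the Metadata block and the requirements.txt are my own additions, need a reasonably recent SAM CLI, and require running sam build before sam deploy):

  SFTPLayer:
    Condition: SFTPUploadActivate
    Type: AWS::Serverless::LayerVersion
    Properties:
      LayerName: paramiko
      ContentUri: ../functions/lambda-layers/paramiko/   # folder would contain a requirements.txt listing paramiko
      CompatibleRuntimes:
        - python3.7
      RetentionPolicy: Retain
    Metadata:
      BuildMethod: python3.7   # tells sam build to pip-install the requirements into the layer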

My question is: how can I launch this stack with the required paramiko package?

Solution

My question is: how can I launch this stack with the required paramiko package?

You need a proper deployment pipeline. If you are on AWS, you can use AWS CodePipeline to deploy the Lambda functions. Since you have build-time dependencies, you need a build stage that actually fetches and builds everything the Lambda deployment package or layer needs.
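A minimal sketch of what such a build stage could run, written as a CodeBuild buildspec.yml (the file layout, the python/ prefix for the layer, and the unpinned paramiko version are assumptions on my part, not something the repo documents):

version: 0.2
phases:
  install:
    runtime-versions:
      python: 3.7
    commands:
      - pip install aws-sam-cli
  build:
    commands:
      # populate the folder that ContentUri in collector-ga.yaml points at;
      # Python layers are importable at runtime only if packages sit under python/
      - mkdir -p functions/lambda-layers/paramiko/python
      - pip install paramiko -t functions/lambda-layers/paramiko/python
      # package and deploy non-interactively, reusing the saved samconfig.toml
      - cd cloudformation && sam deploy --no-confirm-changeset --no-fail-on-empty-changeset

Running the pip install inside the Linux build container also avoids the problem of paramiko's native dependencies (cryptography) being compiled for Windows when the layer is built locally.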

Alternatively, you can use paramiko from some prebuilt or public layer, such as this one. That way you keep your code separate from its dependencies.
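If you go that route, you drop (or simply never activate) the SFTPLayer resource and attach the public layer's ARN to the function that needs SFTP, roughly like this (the ARN and the function's logical ID are placeholders, not values from the repo):

  GACollectorFunction:                 # placeholder logical ID
    Type: AWS::Serverless::Function
    Properties:
      Runtime: python3.7
      Handler: app.handler             # placeholder
      CodeUri: ../functions/collector/ # placeholder
      Layers:
        - arn:aws:lambda:eu-central-1:123456789012:layer:paramiko:1  # ARN of the prebuilt/public paramiko layer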
