
Uploading a file stream to S3 in Node.js with formidable and (knox or aws-sdk)

I am trying to upload a file submitted via a form directly to an Amazon S3 bucket, using aws-sdk or knox. Form handling is done with formidable.

My question is: how do I properly use formidable together with aws-sdk (or knox), handling the streams with the latest features of these libraries?

I am aware that this topic has already been raised here in different flavors, namely:

> How to receive an uploaded file using node.js formidable library and save it to Amazon S3 using knox?
> node application stream file upload directly to amazon s3
> Accessing the raw file stream from a node-formidable file upload (and its very useful accepted answer about overriding form.onPart())

However, I believe the answers are a bit outdated and/or off-topic (i.e. CORS support, which I do not wish to use for now, for various reasons) and/or, most importantly, make no reference to the latest features of aws-sdk (see: https://github.com/aws/aws-sdk-js/issues/13#issuecomment-16085442) or knox (notably putStream() or its readableStream.pipe(req) variant, both explained in the docs).

After hours of struggling, I came to the conclusion that I need some help (disclaimer: I am quite a newbie with streams).

The HTML form:

<form action="/uploadPicture" method="post" enctype="multipart/form-data">
  <input name="picture" type="file" accept="image/*">
  <input type="submit">
</form>

The Express bodyParser middleware is configured this way:

app.use(express.bodyParser({defer: true}))

The POST request handler:

uploadPicture = (req,res,next) ->
  form = new formidable.IncomingForm()
  form.parse(req)

  form.onPart = (part) ->
    if not part.filename
      # Let formidable handle all non-file parts (fields)
      form.handlePart(part)
    else
      handlePart(part,form.bytesExpected)

  handlePart = (part,fileSize) ->
    # aws-sdk version
    params =
      Bucket: "mybucket"
      Key: part.filename
      ContentLength: fileSize
      Body: part # passing stream object as body parameter

    awsS3client.putObject(params,(err,data) ->
      if err
        console.log err
      else
        console.log data
    )

However, I receive the following error:

{ [RequestTimeout: Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.]

message: 'Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.',
code: 'RequestTimeout',
name: 'RequestTimeout',
statusCode: 400,
retryable: false }

A knox version of handlePart(), tailored this way, also fails miserably:

handlePart = (part,fileSize) ->
  headers =
    "Content-Length": fileSize
    "Content-Type": part.mime
  knoxS3client.putStream(part,part.filename,headers,(err,res) ->
    if err
      console.log err
    else
      console.log res
  )

Here too, I get a big res object with a 400 statusCode.

In both cases, the region is configured to eu-west-1.

Additional notes:

node 0.10.12

latest formidable from npm (1.0.14)

latest aws-sdk from npm (1.3.1)

latest knox from npm (0.8.3)

Solution

Well, according to the creator of Formidable, streaming directly to Amazon S3 is impossible:

> The S3 API requires you to provide the size of new files when creating them. This information is not available for multipart/form-data files until they have been fully received. This means streaming is impossible.

Indeed, form.bytesExpected refers to the size of the whole form, not the size of the individual file.
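To see why that is, note that the Content-Length the browser sends (which is what form.bytesExpected reflects) covers the whole multipart body: boundary lines and per-part headers as well as the raw file bytes. A minimal illustration in plain JavaScript (the boundary string and field names here are made up):

```javascript
// The multipart framing around a file adds boundary lines and part
// headers, so the body's total length always exceeds the file's length.
const fileBytes = Buffer.from("hello");
const boundary = "----formboundary"; // made-up boundary
const body = Buffer.concat([
  Buffer.from(
    `--${boundary}\r\n` +
    `Content-Disposition: form-data; name="picture"; filename="a.txt"\r\n` +
    `Content-Type: text/plain\r\n\r\n`
  ),
  fileBytes,
  Buffer.from(`\r\n--${boundary}--\r\n`),
]);
console.log(body.length > fileBytes.length); // prints "true"
```

So a handler that passes form.bytesExpected as ContentLength tells S3 to expect more bytes than the file part will ever deliver, which is consistent with the RequestTimeout error above.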

Therefore, the data must first hit memory or disk on the server before being uploaded to S3.
