
Buffer as input and output for fluent-ffmpeg

The below looks like a lot, but it's primarily just output.

I'm trying to take in a buffer using multer (the request containing the video, in .mov format, is sent through Alamofire from an iPhone) as the input to a fluent-ffmpeg conversion, then I want it to output as a buffer, and then send the result to S3. I think I'm close, but I don't think fluent-ffmpeg can have a buffer passed in. This is deployed on Heroku using this buildpack: https://github.com/jonathanong/heroku-buildpack-ffmpeg-latest. How do I pass it in correctly?

const express = require('express')
const multer = require('multer')
const stream = require('stream')
const ffmpeg = require('fluent-ffmpeg')

const app = express()
const upload = multer({ limits: { fieldSize: 100_000_000 } })
app.post('/create', upload.single('video'), async function (request, response, next) {
  let data = request.body
  console.log(data) // prints [Object: null prototype] { param1: '' }
  let bufferStream = new stream.PassThrough();
  console.log(request.file.buffer) // prints '<Buffer 00 00 00 14 66 74 79 70 71 74 20 20 00 00 00 00 71 74 20 20 00 00 00 08 77 69 64 65 01 0e 28 55 6d 64 61 74 21 20 03 40 68 1c 21 4e ff 3c 59 96 7c 82 ... 17718642 more bytes>'

  new ffmpeg({
    source: stream.Readable.from(request.file.buffer, { objectMode: false })
  })
  // .format('mp4')
  .on('error', function (err) {
    console.log(`Error: ${err}`)
  })
  .on('progress', function (progress) {
    console.log("progress")
  })
  .on('end', function () {
    console.log('Formatting finished!');
    console.log("after");
  })
  .writeToStream(bufferStream);

  // Read the passthrough stream
  const buffers = [];
  bufferStream.on('data', function (buf) {
    buffers.push(buf);
  });
  bufferStream.on('end', function () {
    const outputBuffer = Buffer.concat(buffers);
  // use outputBuffer
  });
  console.log("Added.")
  response.send("Success")
});

The output is what you can see below (without .format('mp4')):

2022-09-03T13:12:24.194384+00:00 heroku[router]: at=info method=POST path="/createBusiness" host=sparrow-testing.herokuapp.com request_id=2774a4ec-e21e-4c2f-8086-460a4cba7d1d fwd="74.71.236.5" dyno=web.1 connect=0ms service=13157ms status=200 bytes=762 protocol=https
2022-09-03T13:12:24.186257+00:00 app[web.1]: [Object: null prototype] { title: '' }
2022-09-03T13:12:24.187296+00:00 app[web.1]: <Buffer 00 00 00 14 66 74 79 70 71 74 20 20 00 00 00 00 71 74 20 20 00 00 00 08 77 69 64 65 01 0e 28 55 6d 64 61 74 21 20 03 40 68 1c 21 4e ff 3c 59 96 7c 82 ... 17718642 more bytes>
2022-09-03T13:12:24.189891+00:00 app[web.1]: Added.
2022-09-03T13:12:24.891564+00:00 app[web.1]: Error: Error: ffmpeg exited with code 1: pipe:1: Invalid argument
2022-09-03T13:12:24.891570+00:00 app[web.1]: 

This output is what you see with .format('mp4'):

2022-09-03T13:17:07.380415+00:00 app[web.1]: [Object: null prototype] { title: '' }
2022-09-03T13:17:07.381335+00:00 app[web.1]: <Buffer 00 00 00 14 66 74 79 70 71 74 20 20 00 00 00 00 71 74 20 20 00 00 00 08 77 69 64 65 01 0e 28 55 6d 64 61 74 21 20 03 40 68 1c 21 4e ff 3c 59 96 7c 82 ... 17718642 more bytes>
2022-09-03T13:17:07.384047+00:00 app[web.1]: Added.
2022-09-03T13:17:07.388457+00:00 heroku[router]: at=info method=POST path="/createBusiness" host=sparrow-testing.herokuapp.com request_id=84e69ead-09b1-4668-8fc8-b9fc9d5f229d fwd="74.71.236.5" dyno=web.1 connect=0ms service=13079ms status=200 bytes=762 protocol=https
2022-09-03T13:17:08.339746+00:00 app[web.1]: Error: Error: ffmpeg exited with code 1: Conversion failed!
2022-09-03T13:17:08.339783+00:00 app[web.1]: 

My uploadFile function works correctly because I use it elsewhere. Normally, I just pass in request.file.buffer, but here it needs to be the buffer produced by the ffmpeg conversion.
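For reference, uploadFile is roughly like the sketch below (using the AWS SDK v2; the bucket name and key here are placeholders and my real helper differs in details). The plan is to call it with the converted buffer instead of request.file.buffer:

const AWS = require('aws-sdk')
const s3 = new AWS.S3()

// Rough sketch of the existing helper; bucket/key values are placeholders
function uploadFile(buffer, key) {
  return s3.upload({
    Bucket: 'my-bucket',      // placeholder
    Key: key,                 // e.g. 'videos/converted.mp4'
    Body: buffer,
    ContentType: 'video/mp4'
  }).promise()
}

// Intended usage inside the passthrough stream's 'end' handler:
// await uploadFile(outputBuffer, 'videos/converted.mp4')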

EDIT:

At Heiko's suggestion, I tried changing the multer initialization to

multer({ limits: { fieldSize: 100_000_000 }, dest: "uploads/" })

and the source I was passing in to ffmpeg to

new ffmpeg({
  source: request.file.path // request.file.path seems to be a path of a Multer-generated file, I'd assume the one I'm sending to the server
})
.format('mp4')

but it still errored out with

Error: ffmpeg exited with code 1: Conversion failed!

when the request.file was:

{
  fieldname: 'video',
  originalname: 'video',
  encoding: '7bit',
  mimetype: 'clientMime',
  destination: 'uploads/',
  filename: '08d5d3bbdcf1ac29fb97800136a306e9',
  path: 'uploads/08d5d3bbdcf1ac29fb97800136a306e9',
  size: 1567480
}

I gave your code a try and I got the same result. After searching for a while, I found an explanation.

Well, the problem is around MP4 and streams.

I am not a video specialist, but it seems that when manipulating an MP4 file, ffmpeg needs to seek into the file, and seeking is by nature incompatible with streams.

So you need to output a filesystem file from ffmpeg (make it temporary by removing it after use) and then use that file according to your needs.
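For example, here is a minimal sketch of that approach, assuming multer is configured with dest: "uploads/" as in your edit so that request.file.path exists (the temp path and cleanup are illustrative only):

const fs = require('fs')
const os = require('os')
const path = require('path')
const ffmpeg = require('fluent-ffmpeg')

// Inside the /create route handler:
// convert the uploaded file to a temporary MP4 on disk,
// read it back as a buffer, then remove the temporary file.
const outputPath = path.join(os.tmpdir(), `${Date.now()}.mp4`)

new ffmpeg({ source: request.file.path })
  .format('mp4')
  .on('error', function (err, stdout, stderr) {
    console.log(`Error: ${err}`)
    console.log('Stderr: %o', stderr)
  })
  .on('end', async function () {
    const outputBuffer = await fs.promises.readFile(outputPath)
    // use outputBuffer (e.g. pass it to your uploadFile), then clean up
    await fs.promises.unlink(outputPath)
  })
  .save(outputPath)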

To get more debug information about the ffmpeg error, you need to modify your error event listener like this:

// .format('mp4')
  .on('error', function (err, stdout, stderr) {
    console.log(`Error: ${err}`)
    console.log('Stdout: %o', stdout);
    console.log('Stderr: %o', stderr);
  })
  .on('progress', function (progress) {

Then, while it executes, you will see the stderr logs of the ffmpeg process.

You can find confirmation of this here:

https://stackoverflow.com/a/39679975/8807231

https://superuser.com/a/760287

MP4, MOV and 3GP all contain some important metadata at the end of the file. This stems from ages predating streaming and makes these formats unfit for it by default.

The solution is to store that so-called moov atom at the beginning of the file. This means you must preprocess iPhone videos in order to stream them.

With ffmpeg, e.g. via FFmpegKit, the option is -movflags faststart.
With AVAssetWriter/AVAssetExportSession it's the flag shouldOptimizeForNetworkUse.
These need to write to a file on device though; maybe you can work around that with a named pipe read by Alamofire.

VLC and iOS separately transfer the moov atom to make sense of the rest. If you could apply this strategy to ffmpeg, you'd only need to find the moov atom on the phone, but I found no examples of ffmpeg set up this way.

Your server code seems fine; it probably only needs -movflags faststart as an output option there, too. For testing, upload an MKV or WEBM file, which ffmpeg can read as a stream out of the box.
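With fluent-ffmpeg, that option could be added roughly like this (a sketch only; the output path is a placeholder and the rest of the chain is your existing code inside the route handler):

new ffmpeg({ source: request.file.path })
  .format('mp4')
  // move the moov atom to the front so the result is streamable
  .outputOptions('-movflags faststart')
  .on('error', function (err) {
    console.log(`Error: ${err}`)
  })
  .save('converted.mp4')   // placeholder output path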
