
fluent-ffmpeg video has stretched image

I have an mp3 audio file and a jpg image file. I want to combine these two files into a new mp4. I have a working fluent-ffmpeg command that does exactly what I want, except that sometimes the image is stretched in the final output video. It seems to happen consistently with jpgs I export from Photoshop.

Is there any way I can specify in my ffmpeg command to keep the same aspect ratio of my image and not stretch it?

My function is below:

async function debugFunction() {
    console.log('debugFunction()')
    //begin setting up ffmpeg
    const ffmpeg = require('fluent-ffmpeg');
    //Get the paths to the packaged versions of the binaries we want to use
    var ffmpegPath = require('ffmpeg-static-electron').path;
    ffmpegPath = ffmpegPath.replace('app.asar', 'app.asar.unpacked')
    var ffprobePath = require('ffprobe-static-electron').path;
    ffprobePath = ffprobePath.replace('app.asar', 'app.asar.unpacked')
    //tell the ffmpeg package where it can find the needed binaries.
    ffmpeg.setFfmpegPath(ffmpegPath);
    ffmpeg.setFfprobePath(ffprobePath);
    //end setting ffmpeg

    let imgPath = "C:\\Users\\marti\\Documents\\martinradio\\image.jpg";
    let audioPath = "C:\\Users\\marti\\Documents\\martinradio\\audio.mp3";
    let vidPath = "C:\\Users\\marti\\Documents\\martinradio\\video.mp4";

    //create ffmpeg command
    ffmpeg()
    //set rendering options
    .input(imgPath)
    .loop()
    .addInputOption('-framerate 2')
    .input(audioPath)
    .videoCodec('libx264')
    .audioCodec('copy')
    .audioBitrate('320k')
    .videoBitrate('8000k', true)
    .size('1920x1080')
    .outputOptions([
        '-preset medium',
        '-tune stillimage',
        '-crf 18',
        '-pix_fmt yuv420p',
        '-shortest'
    ])
    //set status events
    .on('progress', function (progress) {
        if (progress.percent) {
            console.log(`Rendering: ${progress.percent}% done`)
        }
    })
    .on('codecData', function (data) {
        console.log('codecData=', data);
    })
    .on('end', function () {
        console.log('Video has been converted successfully');
    })
    .on('error', function (err) {
        console.log('error rendering video: ' + err.message);
    })
    //run ffmpeg command
    .output(vidPath).run()

}

It renders successfully if I give it an audio file and this image:

(image: the 599×603 source picture)

But the output video looks like this:

(image: the stretched output frame)

You can see that the image was squished and stretched into a rectangle, while I would like to keep it square.

You seem to be using a 16:9 ratio, e.g. .size('1920x1080'), for a picture which is 599×603. You can change the size of the output video so that it matches your actual image's aspect ratio (it can be bigger or smaller, as long as the ratio is similar), or keep 16:9 but add padding on the sides.
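The padding approach can be expressed with ffmpeg's standard scale and pad filters: scale the image down to fit inside the 1920×1080 frame while preserving its aspect ratio, then center it on a canvas of the target size. A minimal sketch of building that filter string (padFilter is a hypothetical helper name, not part of fluent-ffmpeg):

```javascript
// Build a scale+pad filter chain so the source keeps its aspect ratio
// inside the target frame instead of being stretched to fill it.
function padFilter(width, height) {
  return [
    // shrink (or grow) to fit inside width x height, preserving aspect ratio
    `scale=${width}:${height}:force_original_aspect_ratio=decrease`,
    // center the scaled image on a canvas of the full target size
    `pad=${width}:${height}:(ow-iw)/2:(oh-ih)/2`
  ].join(',');
}

console.log(padFilter(1920, 1080));
// scale=1920:1080:force_original_aspect_ratio=decrease,pad=1920:1080:(ow-iw)/2:(oh-ih)/2
```

In the question's command this would replace the .size('1920x1080') call, e.g. with .videoFilters(padFilter(1920, 1080)), so the 599×603 picture ends up centered with black bars on the sides rather than stretched to 16:9.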

