
Spectrogram generation in Java using FFT on a .wav file not producing expected output

I am making an AI project that classifies speech as "up", "down", "left", "right", or background noise, and uses the result to move a character in a video game.

I have written an FFT implementation derived from the mathematical definition, which I believe is correct because I have tested its output against the DFT calculator on this site ( https://engineering.icalculator.info/discrete-fourier-transform-calculator.html ).
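
For anyone wanting to reproduce that check locally, a plain O(N²) DFT computed straight from the definition can be compared term by term against the FFT output. The helper below is only an illustrative sketch of mine (not part of the original code) and reuses the Complex class listed at the end of this post:

// Naive O(N^2) DFT, useful only as a cross-check for the FFT output.
private static Complex[] dft(Complex[] x) {
    int N = x.length;
    var X = new Complex[N];
    for (int k = 0; k < N; k++) {
        var sum = new Complex(0, 0);
        for (int n = 0; n < N; n++) {
            // X[k] = sum over n of x[n] * e^(-2*pi*i*k*n/N)
            var twiddle = Complex.polar(1, -2 * Math.PI * k * n / N);
            sum = Complex.add(sum, Complex.multiply(twiddle, x[n]));
        }
        X[k] = sum;
    }
    return X;
}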

I then tried to generate a spectrogram, basing my code on the main-function code of the application class from this question: Creating spectrogram from .wav using FFT in java.

I tested my code on a .wav file of me saying "hello", and the resulting spectrogram is not what I expected. See below for the difference between the spectrogram produced by my Java code and the one produced by my Python code (ignore the colour difference).

Java spectrogram

Python spectrogram

New Java spectrogram, with SleuthEye's help

Here is the original code I used/wrote:

package STACKOVERFLOW;

import com.company.Complex;

import javax.imageio.ImageIO;
import java.awt.*;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.Scanner;

public class StackOverFlow {
    private static Color getColour(double power) {
        var H = power * 0.4;
        var S = 1.0;
        var B = 1.0;
        return Color.getHSBColor((float) H, (float) S, (float) B);
    }

    private static double[] getAudioData(String filePath) {
        var path = Paths.get(filePath);
        try {
            var entireFileData = Files.readAllBytes(path);
            var rawData = Arrays.copyOfRange(entireFileData, 44, entireFileData.length);
            var length = rawData.length;

            var newLength = length / 4;
            var dataMono = new double[newLength];

            double left, right;
            for (int i = 0; 2 * i + 3 < newLength; i++) {
                left = (short) ((rawData[2 * i + 1] & 0xff) << 8) | (rawData[2 * i] & 0xff);
                right = (short) ((rawData[2 * i + 3] & 0xff) << 8) | (rawData[2 * i + 2] & 0xff);
                dataMono[i] = (left + right) / 2.0;
            }

            return dataMono;
        } catch (IOException e) {
            e.printStackTrace();
        }
        return null;
    }

    private static Complex[] toComplex(double[] samples) {
        var l = samples.length;
        var cOut = new Complex[l];
        for (int i = 0; i < l; i++) {
            cOut[i] = new Complex(samples[i], 0);
        }
        return cOut;
    }

    private static double modulusSquared(Complex a) {
        var real = a.getReal();
        var imaginary = a.getImag();
        return (real * real) + (imaginary * imaginary);
    }

    private static Complex[] fft(Complex[] samples) {
        var N = samples.length; // number of samples
        if (N == 1) return samples; // stops the recursive splits on the samples
        // TODO: M only works for N a power of 2
        var M = N / 2; // middle index of the samples
        var Xeven = new Complex[M]; // array for even split
        var Xodd = new Complex[M]; // array for odd split

        // splits the samples
        for (int i = 0; i < M; i++) {
            Xeven[i] = samples[2 * i];
            Xodd[i] = samples[2 * i + 1];
        }

        // recursive calls on even and odd samples
        var Feven = new Complex[M];
        Feven = fft(Xeven);
        var Fodd = new Complex[M];
        Fodd = fft(Xodd);

        var frequencyBins = new Complex[N];

        for (int i = 0; i < (N / 2); i++) {
            var cExponential = Complex.multiply(
                    Complex.polar(1, -2 * Math.PI * i / N),
                    Fodd[i]
            );

            frequencyBins[i] = Complex.add(
                    Feven[i],
                    cExponential
            );

            frequencyBins[i + N / 2] = Complex.sub(
                    Feven[i],
                    cExponential
            );
        }
        return frequencyBins;
    }

    public static void makeSpectrogram() {
        var scan = new Scanner(System.in);
        System.out.println("Enter file path: ");
        var filePath = scan.nextLine();
        var rawAudioData = getAudioData(filePath);
        assert rawAudioData != null;
        var length = rawAudioData.length;
        var complexAudioData = toComplex(rawAudioData);

        // parameters for FFT
        var windowSize = 256;
        var overlapFactor = 2;
        var windowStep = windowSize / overlapFactor;

        // plotData array
        var nX = (length - windowSize) / windowStep;
        var nY = (windowSize / 2);
        var plotData = new double[nX][nY];

        // amplitudes to normalise
        var maxAmplitude = Double.MIN_VALUE;
        var minAmplitude = Double.MAX_VALUE;
        double amplitudeSquared;

        // application of the FFT
        for (int i = 0; i < nX; i++) {
            var windowSizeArray = fft(Arrays.copyOfRange(complexAudioData, i * windowStep, i * windowStep + windowSize));
            for (int j = 0; j < nY; j++) {
                amplitudeSquared = modulusSquared(windowSizeArray[2 * j]);
                if (amplitudeSquared == 0.0) {
                    plotData[i][nY - j - 1] = amplitudeSquared;
                } else {
                    var threshold = 1.0; // prevents log(0)
                    plotData[i][nY - j - 1] = 10 * Math.log10(Math.max(amplitudeSquared, threshold));
                }

                // find min and max amplitudes
                if (plotData[i][j] > maxAmplitude) {
                    maxAmplitude = plotData[i][j];
                } else if (plotData[i][j] < minAmplitude) {
                    minAmplitude = plotData[i][j];
                }
            }
        }

        // normalisation
        var difference = maxAmplitude - minAmplitude;
        for (int i = 0; i < nX; i++) {
            for (int j = 0; j < nY; j++) {
                plotData[i][j] = (plotData[i][j] - minAmplitude) / difference;
            }
        }

        // plot the spectrogram
        var spectrogram = new BufferedImage(nX, nY, BufferedImage.TYPE_INT_RGB);
        double ratio;
        for (int i = 0; i < nX; i++) {
            for (int j = 0; j < nY; j++) {
                ratio = plotData[i][j];
                var colour = getColour(1.0 - ratio);
                spectrogram.setRGB(i, j, colour.getRGB());
            }
        }

        // write the image to a file
        try {
            var outputFile = new File("saved.png");
            ImageIO.write(spectrogram, "png", outputFile);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        makeSpectrogram();
    }
}

Here is the Complex class used above:

package com.company;

import java.text.DecimalFormat;

public class Complex {

    private final static DecimalFormat df2 = new DecimalFormat("#.##");

    private double r;
    private double i;

    public Complex(double r, double i) {
        this.r = r;
        this.i = i;
    }

    @Override
    public String toString() {
        return "(" + df2.format(this.r) + ", " + df2.format(this.i) + "i) ";
    }

    public double abs() {
        return Math.hypot(this.r, this.i);
    }

    public double getReal() {
        return this.r;
    }

    public double getImag() {
        return this.i;
    }

    public void setReal(double r) {
        this.r = r;
    }

    public void setImag(double i) {
        this.i = i;
    }

    public static Complex polar(double r, double theta) {
        return new Complex(
                r * Math.cos(theta),
                r * Math.sin(theta)
        );
    }

    public static Complex multiply(Complex a, Complex b) {
            /*
             (a + bi) * (c + di) =
             ac + adi + cbi + -bd =
             (ac - bd) + (ad + cb)i
            */
        var real = (a.r * b.r) - (a.i * b.i);
        var imag = (a.r * b.i) + (a.i * b.r);
        return new Complex(real, imag);
    }

    public static Complex add(Complex a, Complex b) {
        return new Complex(
                a.r + b.r,
                a.i + b.i
        );
    }

    public static Complex sub(Complex a, Complex b) {
        return new Complex(
                a.r - b.r,
                a.i - b.i
        );
    }
}

Any guidance would be greatly appreciated.

Reading the .wav file

The .wav file decoding included in the other question you linked is hardly a complete decoder. It only covers that OP's specific use case of stereo audio with 2 bytes per sample.

It looks like you stumbled into other decoding problems while trying to adapt it to a different use case. As general advice, I would recommend using a more complete .wav decoder, one that takes into account the number of channels, the number of bytes per sample, and so on.
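
For instance, the JDK's built-in javax.sound.sampled API already parses the header fields (channel count, sample size, endianness) and hands back the raw PCM frames. A minimal sketch of that route, restricted to 16-bit PCM and not part of the original answer, might look like this:

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.UnsupportedAudioFileException;
import java.io.File;
import java.io.IOException;

// Decodes a .wav file to mono samples using the JDK's audio API, which reads
// the header (channels, sample size, endianness) for you.
private static double[] decodeWavToMono(String filePath)
        throws IOException, UnsupportedAudioFileException {
    try (AudioInputStream in = AudioSystem.getAudioInputStream(new File(filePath))) {
        AudioFormat fmt = in.getFormat();
        if (fmt.getSampleSizeInBits() != 16) {
            throw new UnsupportedAudioFileException("only 16-bit PCM handled in this sketch");
        }
        byte[] bytes = in.readAllBytes();
        int channels = fmt.getChannels();
        int frameSize = fmt.getFrameSize();   // bytes per frame, across all channels
        int frames = bytes.length / frameSize;
        double[] mono = new double[frames];
        for (int f = 0; f < frames; f++) {
            double sum = 0.0;
            for (int c = 0; c < channels; c++) {
                int b0 = bytes[f * frameSize + 2 * c] & 0xff;     // first byte of the sample
                int b1 = bytes[f * frameSize + 2 * c + 1] & 0xff; // second byte of the sample
                short sample = fmt.isBigEndian()
                        ? (short) ((b0 << 8) | b1)   // big-endian: first byte is the high byte
                        : (short) ((b1 << 8) | b0);  // little-endian (the usual case for .wav)
                sum += sample;
            }
            mono[f] = sum / channels;   // average the channels down to mono
        }
        return mono;
    }
}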

On the other hand, if you want to write your own decoder (for example as a learning exercise), a slightly more robust implementation might look like this:

public static short getShort(byte[] buffer, int offset) {
  // little-endian 16-bit read
  return (short) (((buffer[offset + 1] & 0xff) << 8) | (buffer[offset] & 0xff));
}
public static int getNumberOfChannels(byte[] entireFileData) {
  return getShort(entireFileData, 22);      // header offset 22: number of channels
}
public static int getBytesPerSample(byte[] entireFileData) {
  return getShort(entireFileData, 34) / 8;  // header offset 34: bits per sample
}

private static double[] getAudioData(String filePath) {

    ...
    var entireFileData = Files.readAllBytes(path);
    var rawData = Arrays.copyOfRange(entireFileData, 44, entireFileData.length);
    var length = rawData.length;

    int numChannels    = getNumberOfChannels(entireFileData);
    int bytesPerSample = getBytesPerSample(entireFileData);
    int newLength      = length / (bytesPerSample * numChannels);
    var dataMono       = new double[newLength];
    if (2 == bytesPerSample) {
      for (int i = 0; 2 * numChannels * (i + 1) - 1 < length; i++) {
        double sum = 0.0;
        for (int j = 0; j < numChannels; j++) {
          // little-endian 16-bit sample for channel j of frame i
          short sample = (short) (((rawData[2 * numChannels * i + 2 * j + 1] & 0xff) << 8)
                                   | (rawData[2 * numChannels * i + 2 * j] & 0xff));
          sum += sample;
        }
        dataMono[i] = sum / numChannels; // average the channels down to mono
      }
    }
    else {
      ... // handle different number of bytes per sample
    }
}

Note that this still only covers 16-bit PCM samples, assumes a fixed header layout (see this tutorial, although the .wav file format is actually more flexible than that), and will trip over files that contain extension chunks.
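
One way to relax the fixed-header assumption is to walk the RIFF chunk list and locate where the data chunk actually begins, rather than hardcoding offset 44. The helper below is only a sketch of that idea (findDataChunkOffset is a hypothetical name of mine, not from the answer above):

// Walks the RIFF chunk list of a .wav file and returns the offset of the first
// byte of audio data, or -1 if no "data" chunk is found.
private static int findDataChunkOffset(byte[] fileData) {
    int pos = 12; // skip "RIFF" + file size + "WAVE"
    while (pos + 8 <= fileData.length) {
        String chunkId = new String(fileData, pos, 4, java.nio.charset.StandardCharsets.US_ASCII);
        int chunkSize = (fileData[pos + 4] & 0xff)
                | ((fileData[pos + 5] & 0xff) << 8)
                | ((fileData[pos + 6] & 0xff) << 16)
                | ((fileData[pos + 7] & 0xff) << 24);
        if (chunkId.equals("data")) {
            return pos + 8; // the samples start right after the 8-byte chunk header
        }
        pos += 8 + chunkSize + (chunkSize & 1); // chunks are padded to an even length
    }
    return -1;
}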

Processing the spectrum

The FFT library used in the other question you linked returns a double array that is to be interpreted as interleaved real and imaginary parts of the complex values. Accordingly, the indexing used for the magnitude computation works on pairs of elements at indices 2*j and 2*j+1. Your implementation, on the other hand, returns the complex values directly, so you should not skip over values with the factor of 2; instead use:

for (int j = 0; j < nY; j++) {
  amplitudeSquared = modulusSquared(windowSizeArray[j]);
  ...
}
