
How to record android audio playing in headset

The MediaRecorder class in Android is used to record audio from the mic. Can anyone tell me how we can record the audio that is actually being played through the headset? It sounds techy, but yes, that is what I am exploring. I was told the Visualizer class can record system audio, but as per the documentation it can only be used to visualize audio, and we cannot put a recorder interface there.

Read more: http://developer.android.com/reference/android/media/audiofx/Visualizer.html

Will any of the AudioSource constants below serve the purpose?

int CAMCORDER
int DEFAULT
int MIC
int REMOTE_SUBMIX
int VOICE_CALL
int VOICE_COMMUNICATION
int VOICE_DOWNLINK
int VOICE_RECOGNITION
int VOICE_UPLINK
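Of the constants above, `REMOTE_SUBMIX` is the one designed to capture the device's output mix rather than the mic. A minimal sketch of how it would be wired up with `AudioRecord` is below; note this is an illustration only, since `REMOTE_SUBMIX` was added in API 19 and is guarded by the system-only `CAPTURE_AUDIO_OUTPUT` permission, so it will not work from an ordinary third-party app:

```java
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

public class SubmixCaptureSketch {
    // Sketch: build an AudioRecord that reads the output submix.
    // Requires android.permission.CAPTURE_AUDIO_OUTPUT, which is
    // reserved for system apps; normal apps will fail to initialize this.
    public static AudioRecord createSubmixRecorder() {
        int sampleRate = 44100;
        int minBuf = AudioRecord.getMinBufferSize(
                sampleRate,
                AudioFormat.CHANNEL_IN_STEREO,
                AudioFormat.ENCODING_PCM_16BIT);
        return new AudioRecord(
                MediaRecorder.AudioSource.REMOTE_SUBMIX,
                sampleRate,
                AudioFormat.CHANNEL_IN_STEREO,
                AudioFormat.ENCODING_PCM_16BIT,
                minBuf * 2); // double the minimum to reduce overruns
    }
}
```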

Has anyone worked with OpenSL ES? I have heard it also serves this purpose.

If there are any Android APIs or third-party APIs you have come across, please feel free to share the info. A few blogs also say this can be done at the NDK level. If anyone has worked on it or has code examples, kindly let me know.

Thanks

Example code to show Michael:

public class VisualizerView extends View {
  private static final String TAG = "VisualizerView";

  private byte[] mBytes;
  private byte[] mFFTBytes;
  private Rect mRect = new Rect();
  private Visualizer mVisualizer;

  private Set<Renderer> mRenderers;

  private Paint mFlashPaint = new Paint();
  private Paint mFadePaint = new Paint();
  private ByteArrayOutputStream buffer = new ByteArrayOutputStream(); // must be initialized before the capture listener writes to it

  public VisualizerView(Context context, AttributeSet attrs, int defStyle)
  {
    super(context, attrs, defStyle); // pass defStyle through instead of dropping it
    init();
  }

  public VisualizerView(Context context, AttributeSet attrs)
  {
    this(context, attrs, 0);
  }

  public VisualizerView(Context context)
  {
    this(context, null, 0);
  }

  private void init() {
    mBytes = null;
    mFFTBytes = null;

    mFlashPaint.setColor(Color.argb(122, 255, 255, 255));
    mFadePaint.setColor(Color.argb(238, 255, 255, 255)); // Adjust alpha to change how quickly the image fades
    mFadePaint.setXfermode(new PorterDuffXfermode(Mode.MULTIPLY));

    mRenderers = new HashSet<Renderer>();
  }

  /**
   * Links the visualizer to a player
   * @param player - MediaPlayer instance to link to
   */
  public void link(MediaPlayer player)
  {
    if(player == null)
    {
      throw new NullPointerException("Cannot link to null MediaPlayer");
    }

    // Create the Visualizer object and attach it to our media player.
    mVisualizer = new Visualizer(player.getAudioSessionId());
    mVisualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);

    // Pass through Visualizer data to VisualizerView
    Visualizer.OnDataCaptureListener captureListener = new Visualizer.OnDataCaptureListener()
    {
      @Override
      public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes,
          int samplingRate)
      {
        updateVisualizer(bytes);
        //Record: append this capture to the in-memory buffer
        if (bytes.length > 0) {
          buffer.write(bytes, 0, bytes.length);
        }
        //Record ends
      }

      @Override
      public void onFftDataCapture(Visualizer visualizer, byte[] bytes,
          int samplingRate)
      {
        updateVisualizerFFT(bytes);
      }
    };

    mVisualizer.setDataCaptureListener(captureListener,
        Visualizer.getMaxCaptureRate() / 2, true, true);

    // Enabled Visualizer and disable when we're done with the stream
    mVisualizer.setEnabled(true);
    player.setOnCompletionListener(new MediaPlayer.OnCompletionListener()
    {
      @Override
      public void onCompletion(MediaPlayer mediaPlayer)
      {
        mVisualizer.setEnabled(false);

        //Save File
        try {
            buffer.flush();
        } catch (IOException e) {
            e.printStackTrace();
        }
        mBytes = buffer.toByteArray();
        try {
            buffer.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        mVisualizer.release();

        // Note: this writes raw capture bytes with no WAV header, so the result is not a valid .wav file
        File file = new File(Environment.getExternalStorageDirectory(), "music1.wav");
        FileOutputStream fos;

        try {
            fos = new FileOutputStream(file);
            fos.write(mBytes);
            fos.flush();
            fos.close();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        //Save File ends

      }
    });
  }

  public void addRenderer(Renderer renderer)
  {
    if(renderer != null)
    {
      mRenderers.add(renderer);
    }
  }

  public void clearRenderers()
  {
    mRenderers.clear();
  }

  /**
   * Call to release the resources used by VisualizerView. Like with the
   * MediaPlayer it is good practice to call this method
   */
  public void release()
  {
    mVisualizer.release();
  }

  /**
   * Pass data to the visualizer. Typically this will be obtained from the
   * Android Visualizer.OnDataCaptureListener call back. See
   * {@link Visualizer.OnDataCaptureListener#onWaveFormDataCapture }
   * @param bytes
   */
  public void updateVisualizer(byte[] bytes) {
    mBytes = bytes;
    invalidate();
  }

  /**
   * Pass FFT data to the visualizer. Typically this will be obtained from the
   * Android Visualizer.OnDataCaptureListener call back. See
   * {@link Visualizer.OnDataCaptureListener#onFftDataCapture }
   * @param bytes
   */
  public void updateVisualizerFFT(byte[] bytes) {
    mFFTBytes = bytes;
    invalidate();
  }

  boolean mFlash = false;

  /**
   * Call this to make the visualizer flash. Useful for flashing at the start
   * of a song/loop etc...
   */
  public void flash() {
    mFlash = true;
    invalidate();
  }

  Bitmap mCanvasBitmap;
  Canvas mCanvas;


  @Override
  protected void onDraw(Canvas canvas) {
    super.onDraw(canvas);

    // Create canvas once we're ready to draw
    mRect.set(0, 0, getWidth(), getHeight());

    if(mCanvasBitmap == null)
    {
      mCanvasBitmap = Bitmap.createBitmap(canvas.getWidth(), canvas.getHeight(), Config.ARGB_8888);
    }
    if(mCanvas == null)
    {
      mCanvas = new Canvas(mCanvasBitmap);
    }

    if (mBytes != null) {
      // Render all audio renderers
      AudioData audioData = new AudioData(mBytes);
      for(Renderer r : mRenderers)
      {
        r.render(mCanvas, audioData, mRect);
      }
    }

    if (mFFTBytes != null) {
      // Render all FFT renderers
      FFTData fftData = new FFTData(mFFTBytes);
      for(Renderer r : mRenderers)
      {
        r.render(mCanvas, fftData, mRect);
      }
    }

    // Fade out old contents
    mCanvas.drawPaint(mFadePaint);

    if(mFlash)
    {
      mFlash = false;
      mCanvas.drawPaint(mFlashPaint);
    }

    canvas.drawBitmap(mCanvasBitmap, new Matrix(), null);
  }
}

can anyone tell me how can we record audio that is actually played on headset.

You can't, as there's no official support in the Android APIs to do that. It doesn't matter whether you use the Java APIs or the native APIs included in the NDK.
There may be hacks that work on specific devices, e.g. if you've got root access, but I'm not going to cover those. If you're interested, you can try searching and see what you can come up with.


I was told "Visualizer" class can record system audio but as per documentation it can only be used to visualize audio and we cannot put recorder interface there.

The Visualizer has this method:

public int getWaveForm (byte[] waveform)

Returns a waveform capture of currently playing audio content. The capture consists in a number of consecutive 8-bit (unsigned) mono PCM samples equal to the capture size returned by getCaptureSize().

So you can record the currently playing audio using the Visualizer. But as mentioned in the description above, you'll only get low-quality audio data, because the purpose of this method is to provide audio data for visualization, not for general recording.
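A minimal sketch of pulling one such capture directly (assuming you already have a valid audio session id from a playing MediaPlayer, and the RECORD_AUDIO permission the Visualizer requires):

```java
import android.media.audiofx.Visualizer;

public class WaveformGrabSketch {
    // Sketch: grab a single waveform capture of the currently playing audio.
    // The returned data is 8-bit unsigned mono PCM, limited to the capture
    // size range, so it is visualization-quality only, not a real recording.
    public static byte[] captureWaveform(int audioSessionId) {
        Visualizer v = new Visualizer(audioSessionId);
        v.setCaptureSize(Visualizer.getCaptureSizeRange()[1]); // max capture size
        v.setEnabled(true);
        byte[] waveform = new byte[v.getCaptureSize()];
        v.getWaveForm(waveform); // fills the array with the current samples
        v.setEnabled(false);
        v.release(); // always release the effect when done
        return waveform;
    }
}
```

For continuous capture, the `OnDataCaptureListener` approach in the question's code is the better fit; `getWaveForm` is the one-shot polling variant of the same data.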
