AVAudioSession, AudioStreamBasicDescription and RemoteIO device defaults

I am in the process of writing an app that will do digital signal processing, and I want to make it as light as possible. One thing that confounded me for a while was what the default values for various devices might be, so that I could avoid unwanted format conversions taking place before I received the data from the buffers. I came across the following link http://club15cc.com/code-snippets/ios-2/get-the-default-output-stream-format-for-an-audio-unit-in-ios which set me on what I believe to be the right path.

I've extended the code from the link to create and activate an AVAudioSession before reading the ASBD (AudioStreamBasicDescription) contents; the session can then be used to request various "preferred" settings to see what impact they have. I also combined Apple's code for listing the values of an ASBD with the code from the link above.

The code below goes into the ViewController.m file generated by the Single View Application template. Note that you will need to add AudioToolbox.framework and CoreAudio.framework to the project's Linked Frameworks and Libraries.

#import "ViewController.h"
@import AVFoundation;
@import AudioUnit;

@interface ViewController ()

@end

@implementation ViewController

- (void) printASBD:(AudioStreamBasicDescription) asbd {
    char formatIDString[5];
    UInt32 formatID = CFSwapInt32HostToBig (asbd.mFormatID);
    bcopy (&formatID, formatIDString, 4);
    formatIDString[4] = '\0';

    NSLog (@"  Sample Rate:         %10.0f",  asbd.mSampleRate);
    NSLog (@"  Format ID:           %10s",    formatIDString);
    NSLog (@"  Format Flags:        %10X",    (unsigned int)asbd.mFormatFlags);
    NSLog (@"  Bytes per Packet:    %10d",    (unsigned int)asbd.mBytesPerPacket);
    NSLog (@"  Frames per Packet:   %10d",    (unsigned int)asbd.mFramesPerPacket);
    NSLog (@"  Bytes per Frame:     %10d",    (unsigned int)asbd.mBytesPerFrame);
    NSLog (@"  Channels per Frame:  %10d",    (unsigned int)asbd.mChannelsPerFrame);
    NSLog (@"  Bits per Channel:    %10d",    (unsigned int)asbd.mBitsPerChannel);
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

    // Get a reference to the AudioSession and activate it
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [audioSession setActive:YES error:&error];


    // Then get RemoteIO AudioUnit and use it to get the content of the default AudioStreamBasicDescription
    AudioUnit remoteIOUnit;

    AudioComponentDescription audioComponentDesc = {0};
    audioComponentDesc.componentType = kAudioUnitType_Output;
    audioComponentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    audioComponentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // Get component
    AudioComponent audioComponent = AudioComponentFindNext(NULL, &audioComponentDesc);
    AudioComponentInstanceNew(audioComponent, &remoteIOUnit);

    // Read the stream format
    UInt32 asbdSize = sizeof(AudioStreamBasicDescription); // AudioUnitGetProperty takes UInt32 *, not size_t *
    AudioStreamBasicDescription asbd = {0};
    AudioUnitGetProperty(remoteIOUnit,
                         kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output,
                         0,
                         (void *)&asbd,
                         &asbdSize);

    [self printASBD:asbd];
}

@end

I would be interested in knowing the results people obtain on other actual hardware. Note the code was built and deployed to iOS 7.1.

The Format Flags are:

kAudioFormatFlagIsFloat                  = (1 << 0),    // 0x1
kAudioFormatFlagIsBigEndian              = (1 << 1),    // 0x2
kAudioFormatFlagIsSignedInteger          = (1 << 2),    // 0x4
kAudioFormatFlagIsPacked                 = (1 << 3),    // 0x8
kAudioFormatFlagIsAlignedHigh            = (1 << 4),    // 0x10
kAudioFormatFlagIsNonInterleaved         = (1 << 5),    // 0x20
kAudioFormatFlagIsNonMixable             = (1 << 6),    // 0x40
kAudioFormatFlagsAreAllClear             = (1 << 31),

The results I obtained for an iPad 4 are as follows:

Sample Rate:                  0
Format ID:                 lpcm
Format Flags:                29
Bytes per Packet:             4
Frames per Packet:            1
Bytes per Frame:              4
Channels per Frame:           2
Bits per Channel:            32

I guess the lpcm (Linear Pulse Code Modulation) format ID was no surprise. The Format Flags value of x'29' decodes to kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked | kAudioFormatFlagIsNonInterleaved, which together with 32 bits per channel indicates packed, non-interleaved, native-endian 32-bit float samples — not the older 8.24 "fixed float" canonical format, which would set kAudioFormatFlagIsSignedInteger rather than kAudioFormatFlagIsFloat.
