
AVAudioSession, AudioStreamBasicDescription and RemoteIO device defaults

I am in the process of writing an app that will do digital signal processing and I want to make it as light as possible. One thing that confounded me for a while was what the default stream format values for various devices might be, so that I could avoid unwanted conversions taking place before I received the data from the buffers. I came across the following link http://club15cc.com/code-snippets/ios-2/get-the-default-output-stream-format-for-an-audio-unit-in-ios which set me on what I believe to be the right path.

I've extended the code from the link to create and activate an AVAudioSession prior to getting the contents of the ASBD (AudioStreamBasicDescription). The AVAudioSession can then be used to request various "preferred" settings to see what impact they have. I also combined Apple's code for listing the values of an ASBD with the code from the link above.
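
For example, something along the following lines (the 48 kHz sample rate and 5 ms I/O buffer duration are just illustrative values, not recommendations) requests preferred settings and then reads back what the session actually granted:

// Sketch: request preferred hardware settings, then read back what was granted.
// The 48 kHz rate and 5 ms buffer duration are arbitrary example values.
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
[session setPreferredSampleRate:48000.0 error:&error];
[session setPreferredIOBufferDuration:0.005 error:&error];
[session setActive:YES error:&error];

NSLog(@"Actual sample rate:        %f", session.sampleRate);
NSLog(@"Actual IO buffer duration: %f", session.IOBufferDuration);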

The code below is put into the ViewController.m file generated by selecting the Single View Application template. Note you will need to add the AudioToolbox.framework and CoreAudio.framework to the Linked Frameworks and Libraries of the project.

#import "ViewController.h"
@import AVFoundation;
@import AudioUnit;

@interface ViewController ()

@end

@implementation ViewController

- (void) printASBD:(AudioStreamBasicDescription) asbd {
    char formatIDString[5];
    UInt32 formatID = CFSwapInt32HostToBig (asbd.mFormatID);
    bcopy (&formatID, formatIDString, 4);
    formatIDString[4] = '\0';

    NSLog (@"  Sample Rate:         %10.0f",  asbd.mSampleRate);
    NSLog (@"  Format ID:           %10s",    formatIDString);
    NSLog (@"  Format Flags:        %10X",    (unsigned int)asbd.mFormatFlags);
    NSLog (@"  Bytes per Packet:    %10d",    (unsigned int)asbd.mBytesPerPacket);
    NSLog (@"  Frames per Packet:   %10d",    (unsigned int)asbd.mFramesPerPacket);
    NSLog (@"  Bytes per Frame:     %10d",    (unsigned int)asbd.mBytesPerFrame);
    NSLog (@"  Channels per Frame:  %10d",    (unsigned int)asbd.mChannelsPerFrame);
    NSLog (@"  Bits per Channel:    %10d",    (unsigned int)asbd.mBitsPerChannel);
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

    // Get a reference to the AudioSession and activate it
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [audioSession setActive:YES error:&error];


    // Then get RemoteIO AudioUnit and use it to get the content of the default AudioStreamBasicDescription
    AudioUnit remoteIOUnit;

    AudioComponentDescription audioComponentDesc = {0};
    audioComponentDesc.componentType = kAudioUnitType_Output;
    audioComponentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    audioComponentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // Get component
    AudioComponent audioComponent = AudioComponentFindNext(NULL, &audioComponentDesc);
    AudioComponentInstanceNew(audioComponent, &remoteIOUnit);

    // Read the stream format; AudioUnitGetProperty expects the size as a UInt32
    UInt32 asbdSize = sizeof(AudioStreamBasicDescription);
    AudioStreamBasicDescription asbd = {0};
    AudioUnitGetProperty(remoteIOUnit,
                         kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output,
                         0,
                         (void *)&asbd,
                         &asbdSize);

    [self printASBD:asbd];
}

@end

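As an aside, if the aim is to avoid conversions, you can also pin an explicit client format on the RemoteIO unit with AudioUnitSetProperty rather than only inspecting the default. The sketch below is just an illustration (the 44.1 kHz mono 16-bit integer format is an arbitrary example, not what the hardware prefers), reusing the remoteIOUnit from the code above:

// Sketch: set an explicit client format on the RemoteIO unit so no implicit
// conversion happens behind your back. Example format only.
AudioStreamBasicDescription desired = {0};
desired.mSampleRate       = 44100.0;
desired.mFormatID         = kAudioFormatLinearPCM;
desired.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
desired.mChannelsPerFrame = 1;
desired.mBitsPerChannel   = 16;
desired.mBytesPerFrame    = desired.mChannelsPerFrame * (desired.mBitsPerChannel / 8);
desired.mFramesPerPacket  = 1;
desired.mBytesPerPacket   = desired.mBytesPerFrame * desired.mFramesPerPacket;

// Input scope of the output element (bus 0) is the format the app supplies for playback.
OSStatus status = AudioUnitSetProperty(remoteIOUnit,
                                       kAudioUnitProperty_StreamFormat,
                                       kAudioUnitScope_Input,
                                       0,
                                       &desired,
                                       sizeof(desired));
if (status != noErr) {
    NSLog(@"AudioUnitSetProperty failed: %d", (int)status);
}
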
I would be interested in knowing the results people obtain for other actual hardware. Note that the code was built and deployed to iOS 7.1.

Format Flags are:

kAudioFormatFlagIsFloat                  = (1 << 0),    // 0x1
kAudioFormatFlagIsBigEndian              = (1 << 1),    // 0x2
kAudioFormatFlagIsSignedInteger          = (1 << 2),    // 0x4
kAudioFormatFlagIsPacked                 = (1 << 3),    // 0x8
kAudioFormatFlagIsAlignedHigh            = (1 << 4),    // 0x10
kAudioFormatFlagIsNonInterleaved         = (1 << 5),    // 0x20
kAudioFormatFlagIsNonMixable             = (1 << 6),    // 0x40
kAudioFormatFlagsAreAllClear             = (1 << 31),
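
A small helper along these lines (the function name is my own) will print which of those flags are set in a given mFormatFlags value:

// Sketch: decode an mFormatFlags value into the flag names listed above.
static void PrintFormatFlags(UInt32 flags) {
    if (flags & kAudioFormatFlagIsFloat)          NSLog(@"  kAudioFormatFlagIsFloat");
    if (flags & kAudioFormatFlagIsBigEndian)      NSLog(@"  kAudioFormatFlagIsBigEndian");
    if (flags & kAudioFormatFlagIsSignedInteger)  NSLog(@"  kAudioFormatFlagIsSignedInteger");
    if (flags & kAudioFormatFlagIsPacked)         NSLog(@"  kAudioFormatFlagIsPacked");
    if (flags & kAudioFormatFlagIsAlignedHigh)    NSLog(@"  kAudioFormatFlagIsAlignedHigh");
    if (flags & kAudioFormatFlagIsNonInterleaved) NSLog(@"  kAudioFormatFlagIsNonInterleaved");
    if (flags & kAudioFormatFlagIsNonMixable)     NSLog(@"  kAudioFormatFlagIsNonMixable");
}
// e.g. PrintFormatFlags(asbd.mFormatFlags); // 0x29 -> IsFloat, IsPacked, IsNonInterleaved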

The results I obtained for an iPad 4 are as follows:

Sample Rate:                  0
Format ID:                 lpcm
Format Flags:                29
Bytes per Packet:             4
Frames per Packet:            1
Bytes per Frame:              4
Channels per Frame:           2
Bits per Channel:            32

I guess lpcm (linear pulse-code modulation) was no surprise. The Format Flags value of 0x29 decodes to kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked | kAudioFormatFlagIsNonInterleaved, which together with the 32 bits per channel points to non-interleaved 32-bit float samples rather than the 8.24 fixed-point "canonical" format I had expected.
