
iOS: Audio and Video tap from AirPlay

I have made a video player that analyzes the real-time audio and video tracks of the video that is currently playing. The videos are stored on the iOS device (in the app's Documents directory).

This all works fine. I use an MTAudioProcessingTap to get all the audio samples and run an FFT on them, and I analyze the video by simply copying the pixel buffers at the currently played CMTime (the AVPlayer currentTime property). As I said, this works fine.

But now I want to support AirPlay. AirPlay itself is not difficult, but my taps stop working as soon as AirPlay is toggled and the video is playing on the Apple TV. Somehow, the MTAudioProcessingTap won't process, and the pixel buffers are all empty... I can't get to the data.

Is there any way to get to this data?

In order to get the pixel buffers, I just fire an event every few milliseconds and retrieve the player's currentTime. Then:

CVPixelBufferRef imageBuffer = [videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:nil];
if (imageBuffer == NULL) {
    return; // no frame available for this time (always the case once AirPlay takes over)
}
CVPixelBufferLockBaseAddress(imageBuffer, 0);

uint8_t *tempAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);

// ... read the pixels here, while the base address is locked ...

CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
CVPixelBufferRelease(imageBuffer); // copyPixelBufferForItemTime: returns a retained buffer

Here tempAddress is my pixel buffer, and videoOutput is an instance of AVPlayerItemVideoOutput.
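For completeness, the polling itself is nothing special. A rough sketch of the setup, with CADisplayLink standing in for my "event every few milliseconds" (the playerItem property and the BGRA pixel format are assumptions of this sketch, not necessarily what you need):

#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

// Attach an AVPlayerItemVideoOutput to the item; BGRA keeps CPU access simple.
- (void)attachVideoOutputToItem:(AVPlayerItem *)playerItem {
    NSDictionary *attributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
    [playerItem addOutput:self.videoOutput];

    // Fires once per screen refresh -- the "every few milliseconds" tick.
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(displayLinkDidFire:)];
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)displayLinkDidFire:(CADisplayLink *)link {
    CMTime itemTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([self.videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        // copyPixelBufferForItemTime: as in the snippet above
    }
}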

For audio, I use:

AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];

// Create a processing tap for the input parameters
MTAudioProcessingTapCallbacks callbacks;

callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
callbacks.clientInfo = (__bridge void *)(self);
callbacks.init = init;
callbacks.prepare = prepare;
callbacks.process = process;
callbacks.unprepare = unprepare;
callbacks.finalize = finalize;

MTAudioProcessingTapRef tap;
OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                          kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
if (err || !tap) {
    NSLog(@"Unable to create the Audio Processing Tap");
    return;
}

inputParams.audioTapProcessor = tap;
CFRelease(tap); // the mix retains the tap; release our +1 from MTAudioProcessingTapCreate

// Create a new AVAudioMix and assign it to our AVPlayerItem
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[inputParams];
playerItem.audioMix = audioMix;
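The five callbacks assigned above are plain C functions, and audioTrack (used at the top) is the item's audio AVAssetTrack. For reference, a minimal sketch of the callback signatures; the actual FFT work is elided:

// A minimal sketch of the tap callbacks (the FFT itself is elided).
void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
    *tapStorageOut = clientInfo; // stash self; process() can fetch it via MTAudioProcessingTapGetStorage()
}
void finalize(MTAudioProcessingTapRef tap) {}
void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames,
             const AudioStreamBasicDescription *processingFormat) {}
void unprepare(MTAudioProcessingTapRef tap) {}
void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
             MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
             CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    // Pull the source audio through the tap; the samples land in bufferListInOut.
    OSStatus status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                         flagsOut, NULL, numberFramesOut);
    if (status != noErr) return;
    // ... run the FFT on bufferListInOut->mBuffers[0].mData here ...
}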

Regards, Niek

Unfortunately, in my experience it's not possible to get at the audio/video data during AirPlay: the playback is done on the Apple TV, so the iOS device doesn't have any of the information.

I had the same issue getting SMPTE subtitle data out of the timedMetaData, which stops being reported during AirPlay.
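What you can do is detect when playback goes external and degrade gracefully, e.g. pause the visualizer. A minimal sketch, assuming a player property; the KVO approach is my suggestion, not something from the question:

- (void)startObservingExternalPlayback {
    // AVPlayer.externalPlaybackActive flips to YES when the video plays on the Apple TV.
    [self.player addObserver:self
                  forKeyPath:@"externalPlaybackActive"
                     options:NSKeyValueObservingOptionNew
                     context:NULL];
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if ([keyPath isEqualToString:@"externalPlaybackActive"]) {
        BOOL external = [change[NSKeyValueChangeNewKey] boolValue];
        // The taps and pixel buffers go quiet while external == YES,
        // so pause the FFT/frame analysis here and resume when it flips back.
        (void)external;
    }
}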

Here is the solution:

This is how I implement AirPlay. I use this code only for audio in my app; I don't know if you can improve it for video, but you can try ;)

In AppDelegate.m:

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {

    [RADStyle applyStyle];
    [radiosound superclass];
    [self downloadZip];

    // Configure the audio session so playback keeps going over AirPlay
    NSError *sessionError = nil;
    [[AVAudioSession sharedInstance] setDelegate:self];
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
    [[AVAudioSession sharedInstance] setActive:YES error:nil];

    UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);

    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

    // Needed for the remote-control / lock-screen events used below
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];

    return YES; // the method is declared to return BOOL
}

And if you use AirPlay, it's nice to implement the lock screen controls: artwork, stop/play, title, etc.

In the DetailViewController of your player, use this code:

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];

    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:(self.saved)[@"image"]]];

    if (imageData == nil) {
        // Fall back to a bundled image when no artwork could be downloaded
        MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"lockScreen.png"]];
        infoCenter.nowPlayingInfo = @{MPMediaItemPropertyTitle: saved[@"web"],
                                      MPMediaItemPropertyArtist: saved[@"title"],
                                      MPMediaItemPropertyArtwork: albumArt};
    } else {
        MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageWithData:imageData]];
        infoCenter.nowPlayingInfo = @{MPMediaItemPropertyTitle: saved[@"link"],
                                      MPMediaItemPropertyArtist: saved[@"title"],
                                      MPMediaItemPropertyArtwork: albumArt};
    }
}
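beginReceivingRemoteControlEvents and becomeFirstResponder only enable the lock-screen controls; the events still need a handler in the same responder. A possible sketch (the actual player calls are left to you):

// Respond to the lock-screen / headphone transport controls.
- (void)remoteControlReceivedWithEvent:(UIEvent *)receivedEvent {
    if (receivedEvent.type != UIEventTypeRemoteControl) return;
    switch (receivedEvent.subtype) {
        case UIEventSubtypeRemoteControlPlay:
            // [self.player play];
            break;
        case UIEventSubtypeRemoteControlPause:
        case UIEventSubtypeRemoteControlStop:
            // [self.player pause];
            break;
        case UIEventSubtypeRemoteControlTogglePlayPause:
            // toggle play/pause on your player
            break;
        default:
            break;
    }
}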

Hope this code can help you ;)
