
Develop an AudioUnit in Swift

I have an application that should also work as an AudioUnit plugin. In Xcode, I go to File > New > Target, select Audio Unit, and make sure that the selected language is "Swift". In the generated code, though, the actual code of the plugin is in ".h" and ".m" Objective-C files:

#import "ChordezAuAudioUnit.h"

#import <AVFoundation/AVFoundation.h>

// Define parameter addresses.
const AudioUnitParameterID myParam1 = 0;

@interface ChordezAuAudioUnit ()

@property (nonatomic, readwrite) AUParameterTree *parameterTree;
@property AUAudioUnitBusArray *inputBusArray;
@property AUAudioUnitBusArray *outputBusArray;
@end


@implementation ChordezAuAudioUnit
@synthesize parameterTree = _parameterTree;

- (instancetype)initWithComponentDescription:(AudioComponentDescription)componentDescription

[...]

How do I develop the plugin in Swift? In this GitHub project, the author seems to be doing it, but I don't know how to replace the generated code with the Swift one: https://github.com/inquisitiveSoft/MIDIAudioUnitExample/blob/main/TransposeOctaveAudioUnit/MIDIAudioUnit/MIDIAudioUnit.swift

In a 2017 WWDC session on Core Audio, Apple specifically recommended against using Swift inside the real-time audio context, due to a small probability that memory allocation or other locks might occur in the Swift runtime. AFAIK, that recommendation has not been rescinded by Apple (yet?).

So, if you want your Audio Unit plug-in to be reliable, the answer is to NOT develop that portion of your plug-in in Swift. Stick to the C subset of Objective-C (no object messaging or instance variables) for any critical real-time code.
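To make that concrete, here is a minimal sketch (names like `RenderState` and `render` are hypothetical, not from Apple's template) of what a real-time-safe render function looks like in the C subset: no heap allocation, no locks, no Objective-C messaging, only reads of preallocated state and writes into caller-provided buffers.

```c
#include <stddef.h>

/* Hypothetical per-instance state, allocated once at initialization
   time -- never inside the render callback. */
typedef struct {
    float gain;  /* parameter value, updated from outside the render thread */
} RenderState;

/* A minimal real-time-safe render function: plain C, no malloc, no
   locks, no object messaging. Applies a gain to the input buffer. */
static void render(RenderState *state,
                   const float *in, float *out, size_t frameCount)
{
    const float gain = state->gain;
    for (size_t i = 0; i < frameCount; i++) {
        out[i] = in[i] * gain;
    }
}
```

In a real plug-in this function (or an equivalent Objective-C++ kernel) is what the audio unit's render block forwards to; everything it touches must be set up before rendering starts.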

No Swift "inside the real-time audio context".

Absolutely. You will commonly see C and Objective-C++ in the render block.

You can use Swift in the rest of the audio unit and its UI. You can even use SwiftUI.
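As a hedged sketch of that split (the class name and parameter are hypothetical, not the generated template's code): the non-real-time parts of an `AUAudioUnit` subclass — initialization, buses, the parameter tree — can live in Swift, with only the DSP itself left to a C or Objective-C++ render kernel.

```swift
import AVFoundation

// Sketch only: the non-real-time side of an AUAudioUnit in Swift.
// All of this runs off the render thread, so Swift is fine here.
public class ChordezAuAudioUnit: AUAudioUnit {
    private var _parameterTree: AUParameterTree?
    private var inputBusArray: AUAudioUnitBusArray!
    private var outputBusArray: AUAudioUnitBusArray!

    public override init(componentDescription: AudioComponentDescription,
                         options: AudioComponentInstantiationOptions = []) throws {
        try super.init(componentDescription: componentDescription, options: options)

        let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!
        let inputBus = try AUAudioUnitBus(format: format)
        let outputBus = try AUAudioUnitBus(format: format)
        inputBusArray = AUAudioUnitBusArray(audioUnit: self, busType: .input,
                                            busses: [inputBus])
        outputBusArray = AUAudioUnitBusArray(audioUnit: self, busType: .output,
                                             busses: [outputBus])

        // Parameter setup in Swift is fine: it happens at init time,
        // not in the render callback. "gain" is a hypothetical parameter.
        let gain = AUParameterTree.createParameter(
            withIdentifier: "gain", name: "Gain", address: 0,
            min: 0, max: 1, unit: .linearGain, unitName: nil,
            flags: [], valueStrings: nil, dependentParameters: nil)
        _parameterTree = AUParameterTree.createTree(withChildren: [gain])
    }

    public override var parameterTree: AUParameterTree? {
        get { _parameterTree }
        set { _parameterTree = newValue }
    }
    public override var inputBusses: AUAudioUnitBusArray { inputBusArray }
    public override var outputBusses: AUAudioUnitBusArray { outputBusArray }

    // internalRenderBlock is deliberately omitted here: per the advice
    // above, it should forward to a C / Objective-C++ render function
    // rather than doing DSP in Swift.
}
```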
