
How to pass parameters from Objective-C to Swift within init

In Apple's iOS sample code HelloMetronome: https://developer.apple.com/library/content/samplecode/HelloMetronome/Introduction/Intro.html#//apple_ref/doc/uid/TP40017587

Right now, Apple hard-codes the tempo as 120 in the call self.setTempo(120) at the end of the following code.

override init() {
    super.init()
    // Use two triangle waves which are generated for the metronome bips.

    // Create a standard audio format deinterleaved float.
    let format = AVAudioFormat(standardFormatWithSampleRate: 44100.0, channels: 2)

    // How many audio frames?
    let bipFrames: UInt32 = UInt32(GlobalConstants.kBipDurationSeconds * Float(format.sampleRate))

    // Create the PCM buffers.
    soundBuffer.append(AVAudioPCMBuffer(pcmFormat: format, frameCapacity: bipFrames))
    soundBuffer.append(AVAudioPCMBuffer(pcmFormat: format, frameCapacity: bipFrames))

    // Fill in the number of valid sample frames in the buffers (required).
    soundBuffer[0]?.frameLength = bipFrames
    soundBuffer[1]?.frameLength = bipFrames

    // Generate the metronome bips; the first buffer will be A440 and the second buffer Middle C.
    let wg1 = TriangleWaveGenerator(sampleRate: Float(format.sampleRate))                     // A 440
    let wg2 = TriangleWaveGenerator(sampleRate: Float(format.sampleRate), frequency: 261.6)   // Middle C
    wg1.render(soundBuffer[0]!)
    wg2.render(soundBuffer[1]!)

    // Connect player -> output, with the format of the buffers we're playing.
    let output: AVAudioOutputNode = engine.outputNode

    engine.attach(player)
    engine.connect(player, to: output, fromBus: 0, toBus: 0, format: format)

    bufferSampleRate = format.sampleRate

    // Create a serial dispatch queue for synchronizing callbacks.
    syncQueue = DispatchQueue(label: "Metronome")

    self.setTempo(120)
}

How can I pass a parameter (instead of hard-coding 120) from the following Objective-C code into the Swift init above:

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    NSLog(@"Hello, Metronome!\n");

    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

    [audioSession setCategory:AVAudioSessionCategoryAmbient error:&error];
    if (error) {
        NSLog(@"AVAudioSession error %ld, %@", error.code, error.localizedDescription);
    }

    [audioSession setActive:YES error:&error];
    if (error) {
        NSLog(@"AVAudioSession error %ld, %@", error.code, error.localizedDescription);
    }

    // If media services are reset, we need to rebuild our audio chain.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(handleMediaServicesWereReset:)
                                                 name:AVAudioSessionMediaServicesWereResetNotification
                                               object:audioSession];

    metronome = [[Metronome alloc] init];
    metronome.delegate = self;
}

Thank you very much!

To add a parameter to the Swift initializer, change

override init() {
...
self.setTempo(120)

init(frequency: Int) {
...
self.setTempo(frequency)

This will let you call the initializer from Objective-C as

[[Metronome alloc] initWithFrequency: (your frequency)];
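Put together, a minimal sketch of the parameterized initializer might look like the following. The engine and buffer setup is elided, and the Float conversion in setTempo is an assumption based on the sample's tempo being a floating-point value; since Metronome already subclasses NSObject in the sample, Swift exposes init(frequency:) to Objective-C as -initWithFrequency: automatically.

import AVFoundation

class Metronome: NSObject {
    // Store the requested tempo so the rest of the class can read it.
    private let frequency: Int

    // Designated initializer taking the tempo instead of hard-coding 120.
    // Objective-C sees this as -initWithFrequency:.
    init(frequency: Int) {
        self.frequency = frequency
        super.init()
        // ... the engine/buffer setup from the original init goes here ...
        self.setTempo(Float(frequency))
    }

    func setTempo(_ tempo: Float) {
        // ... original tempo logic from the sample ...
    }
}

Then, once the generated "-Swift.h" bridging header is imported, the Objective-C side becomes:

metronome = [[Metronome alloc] initWithFrequency:90];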

As for your sound problem, it is hard to say what is going on without more context about what you are trying to do, but I would try moving your initialization code from viewDidLoad to viewDidAppear.
