
Unable to initialize an AudioGraph in MainPage.xaml.cs and set frequency with a slider on MainPage.xaml.cs

I am setting up an AudioGraph in App.xaml.cs because if I try to do it in MainPage, the app hangs and never returns an AudioGraph.

Then I want to set a variable frequency that will be controlled by a slider in MainPage.xaml.cs.

Then, when the 'a' key is pressed, that frequency will be played through the audio graph.

To get this to work, I need to start the AudioGraph from MainPage.xaml.cs.

How do I take the AudioGraph that I can only create in App.xaml.cs and put it into an AudioGraph object in MainPage.xaml.cs?

I've tried initializing the AudioGraph in MainPage.xaml.cs, but it never returns and hangs.

I've tried setting the frequency variable from App.xaml.cs and couldn't because the class is sealed.

In fact, both classes are sealed, so I'm not sure how to get the two to share variables. Even when I make the fields public, it doesn't work.
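(Worth noting: `sealed` in C# only stops other classes from *inheriting* from `App` or `MainPage`; it does not prevent adding public or static members to them, which is the pattern the working code further down relies on. Here is a rough, language-agnostic sketch of that pattern, using hypothetical Python stand-ins for the `setPitch`/`StartNoise`/`StopNoise` members:)

```python
# Sketch of the "static members on the App class" pattern: the App side
# owns the audio state; the MainPage side calls static entry points and
# never needs a reference to an App instance.

class App:
    # Class-level (static) state, mirroring `static float pitch` etc.
    pitch = 1000.0
    running = False

    @staticmethod
    def set_pitch(value):
        App.pitch = value

    @staticmethod
    def start_noise():
        App.running = True

    @staticmethod
    def stop_noise():
        App.running = False

# "MainPage" side: no App instance needed, just the class.
App.set_pitch(440.0)
App.start_noise()
```

The same shape works in C# because `sealed` has no effect on static members; only subclassing is blocked.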

Here is MainPage.xaml.cs

namespace FG
{
    /// <summary>
    /// An empty page that can be used on its own or navigated to within a Frame.
    /// </summary>
    public sealed partial class MainPage : Page
    {

        // For audio nodes through which audio data flows
        AudioGraph audioGraph;
        // Pushes audio data that is generated
        AudioFrameInputNode frameInputNode;
        double pitch = 1000; // choosing to generate frequency of 1kHz
        public MainPage()
        {
            this.InitializeComponent();
            this.Focus(FocusState.Keyboard);
        }

        public void setAudioGraph(AudioGraph ag)
        {
            audioGraph = ag;
        }

        public void setFrameInputNode(AudioFrameInputNode fin)
        {
            frameInputNode = fin;
        }


        private void Slider_ValueChanged(object sender, RangeBaseValueChangedEventArgs e)
        {
            Slider slider = sender as Slider;
            if (slider != null)
            {
                pitch = slider.Value;
            }
        }

        private void Key_Down(object sender, KeyRoutedEventArgs e)
        {
            if (e.Key == VirtualKey.A)
            {
                frameInputNode.Start();
                audioGraph.Start();
            }
        }

        private void Key_Up(object sender, KeyRoutedEventArgs e)
        {
            frameInputNode.Stop();
            audioGraph.Stop();
        }
    }
}

Here is App.xaml.cs

namespace FG
{
    /// <summary>
    /// Provides application-specific behavior to supplement the default Application class.
    /// </summary>
    sealed partial class App : Application
    {

        // For audio nodes through which audio data flows
        AudioGraph audioGraph;
        // Pushes audio data that is generated
        AudioFrameInputNode frameInputNode;
        // For audio out
        AudioDeviceOutputNode deviceOutputNode;
        // Access to the underlying memory buffer
        [ComImport]
        [Guid("5B0D3235-4DBA-4D44-865E-8F1D0E4FD04D")]
        [InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
        unsafe interface IMemoryBufferByteAccess
        {
            void GetBuffer(out byte* buffer, out uint capacity);
        }

        /// <summary>
        /// Initializes the singleton application object.  This is the first line of authored code
        /// executed, and as such is the logical equivalent of main() or WinMain().
        /// </summary>
        public App()
        {
            this.InitializeComponent();
            // Setup audio pipeline
            InitAudioGraph().Wait();
            CreateDeviceOutputNode().Wait();
            CreateFrameInputNode();
            frameInputNode.AddOutgoingConnection(deviceOutputNode);
            this.Suspending += OnSuspending;
        }

        public async Task InitAudioGraph()
        {
            AudioGraphSettings settings = new AudioGraphSettings(Windows.Media.Render.AudioRenderCategory.Media);
            CreateAudioGraphResult result = await AudioGraph.CreateAsync(settings);
            audioGraph = result.Graph;
        }

        private async Task CreateDeviceOutputNode()
        {
            // Create a device output node
            CreateAudioDeviceOutputNodeResult result = await audioGraph.CreateDeviceOutputNodeAsync();
            deviceOutputNode = result.DeviceOutputNode;
        }

        private void CreateFrameInputNode()
        {
            // Create the FrameInputNode at the same format as the graph, except explicitly set mono.
            AudioEncodingProperties nodeEncodingProperties = audioGraph.EncodingProperties;
            nodeEncodingProperties.ChannelCount = 1;
            frameInputNode = audioGraph.CreateFrameInputNode(nodeEncodingProperties);

            // Initialize the Frame Input Node in the stopped state
            frameInputNode.Stop();

            // Hook up an event handler so we can start generating samples when needed
            // This event is triggered when the node is required to provide data
            frameInputNode.QuantumStarted += node_QuantumStarted;
        }

        private void node_QuantumStarted(AudioFrameInputNode sender, FrameInputNodeQuantumStartedEventArgs args)
        {
            // GenerateAudioData can provide PCM audio data by directly synthesizing it or reading from a file.
            // Need to know how many samples are required. In this case, the node is running at the same rate as the rest of the graph
            // For minimum latency, only provide the required amount of samples. Extra samples will introduce additional latency.
            uint numSamplesNeeded = (uint)args.RequiredSamples;

            if (numSamplesNeeded != 0)
            {
                AudioFrame audioData = GenerateAudioData(numSamplesNeeded);
                frameInputNode.AddFrame(audioData);
            }
        }

        unsafe private AudioFrame GenerateAudioData(uint samples)
        {

            double audioWaveTheta = 0;

            // Buffer size is (number of samples) * (size of each sample)
            // We choose to generate single channel (mono) audio. For multi-channel, multiply by number of channels
            uint bufferSize = samples * sizeof(float);
            AudioFrame frame = new Windows.Media.AudioFrame(bufferSize);

            using (AudioBuffer buffer = frame.LockBuffer(AudioBufferAccessMode.Write))
            using (IMemoryBufferReference reference = buffer.CreateReference())
            {
                byte* dataInBytes;
                uint capacityInBytes;
                float* dataInFloat;

                // Get the buffer from the AudioFrame
                ((IMemoryBufferByteAccess)reference).GetBuffer(out dataInBytes, out capacityInBytes);

                // Cast to float since the data we are generating is float
                dataInFloat = (float*)dataInBytes;

                float amplitude = 0.3f;
                int sampleRate = (int)audioGraph.EncodingProperties.SampleRate;
                double sampleIncrement = (pitch * (Math.PI * 2)) / sampleRate;

                // Generate a 1kHz sine wave and populate the values in the memory buffer
                for (int i = 0; i < samples; i++)
                {
                    double sinValue = amplitude * Math.Sin(audioWaveTheta);
                    dataInFloat[i] = (float)sinValue;
                    audioWaveTheta += sampleIncrement;
                }
            }

            return frame;
        }
    }
}

I expect to be able to initialize an AudioGraph (only works in App.xaml.cs) and use it to play a frequency that is set with a slider on the MainPage and triggered with a keydown event on the MainPage.

I got it to work with the following code:

```
// For generating frequencies
namespace FG
{
    /// <summary>
    /// Provides application-specific behavior to supplement the default Application class.
    /// </summary>
    sealed partial class App : Application
    {
        // For audio nodes through which audio data flows
        static AudioGraph audioGraph;
        // Pushes audio data that is generated
        static AudioFrameInputNode frameInputNode;
        // Pitch in hz
        static float pitch = 1000;
        // For audio out
        AudioDeviceOutputNode deviceOutputNode;
        // Access to the underlying memory buffer
        [ComImport]
        [Guid("5B0D3235-4DBA-4D44-865E-8F1D0E4FD04D")]
        [InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
        unsafe interface IMemoryBufferByteAccess
        {
            void GetBuffer(out byte* buffer, out uint capacity);
        }

        /// <summary>
        /// Initializes the singleton application object.  This is the first line of authored code
        /// executed, and as such is the logical equivalent of main() or WinMain().
        /// </summary>
        public App()
        {
            this.InitializeComponent();
            // Setup audio pipeline
            InitAudioGraph().Wait();
            // For audio playback
            CreateDeviceOutputNode().Wait();
            // For creation of audio frames
            CreateFrameInputNode();
            frameInputNode.AddOutgoingConnection(deviceOutputNode);
            this.Suspending += OnSuspending;
        }

        // Initializes AudioGraph
        public async Task InitAudioGraph()
        {
            AudioGraphSettings settings = new AudioGraphSettings(AudioRenderCategory.Media);
            CreateAudioGraphResult result = await AudioGraph.CreateAsync(settings);
            audioGraph = result.Graph;
        }

        // Creates an AudioDeviceOutputNode for sending audio to playback device
        private async Task CreateDeviceOutputNode()
        {
            // Create a device output node
            CreateAudioDeviceOutputNodeResult result = await audioGraph.CreateDeviceOutputNodeAsync();
            deviceOutputNode = result.DeviceOutputNode;
        }

        // Creates FrameInputNode for taking in audio frames
        private void CreateFrameInputNode()
        {
            // Create the FrameInputNode with the same format as the graph
            // (note: unlike the first version, this no longer forces mono)
            AudioEncodingProperties nodeEncodingProperties = audioGraph.EncodingProperties;
            frameInputNode = audioGraph.CreateFrameInputNode(nodeEncodingProperties);
            // Initialize the Frame Input Node in the stopped state
            frameInputNode.Stop();
            // Hook up an event handler so we can start generating samples when needed
            // This event is triggered when the node is required to provide data
            frameInputNode.QuantumStarted += node_QuantumStarted;
        }

        // For creating audio frames on the fly
        private void node_QuantumStarted(AudioFrameInputNode sender, FrameInputNodeQuantumStartedEventArgs args)
        {
            // GenerateAudioData can provide PCM audio data by directly synthesizing it or reading from a file.
            // Need to know how many samples are required. In this case, the node is running at the same rate as the rest of the graph
            // For minimum latency, only provide the required amount of samples. Extra samples will introduce additional latency.
            uint numSamplesNeeded = (uint)args.RequiredSamples;
            if (numSamplesNeeded != 0)
            {
                AudioFrame audioData = GenerateAudioData(numSamplesNeeded);
                frameInputNode.AddFrame(audioData);
            }
        }

        // Generate audioframes for the audiograph
        unsafe private AudioFrame GenerateAudioData(uint samples)
        {
            // Buffer size is (number of samples) * (size of each sample)
            // We choose to generate single channel (mono) audio. For multi-channel, multiply by number of channels
            uint bufferSize = samples * frameInputNode.EncodingProperties.BitsPerSample;
            AudioFrame frame = new Windows.Media.AudioFrame(bufferSize);

            using (AudioBuffer buffer = frame.LockBuffer(AudioBufferAccessMode.Write))
            using (IMemoryBufferReference reference = buffer.CreateReference())
            {
                byte* dataInBytes;
                uint capacityInBytes;
                float* dataInFloat;

                // Get the buffer from the AudioFrame
                ((IMemoryBufferByteAccess)reference).GetBuffer(out dataInBytes, out capacityInBytes);

                // Cast to float since the data we are generating is float
                dataInFloat = (float*)dataInBytes;

                float amplitude = 0.3f;
                int sampleRate = (int)audioGraph.EncodingProperties.SampleRate;
                double sampleIncrement = ((pitch*2*Math.PI)/sampleRate);

                // Generate a sine wave at the requested pitch and populate the values in the memory buffer
                for (int i = 0; i < samples; i++)
                {
                    double sinValue = amplitude * Math.Sin(i*sampleIncrement);
                    dataInFloat[i] = (float)sinValue;
                }
            }

            return frame;
        }

        /// <summary>
        /// Invoked when the application is launched normally by the end user.  Other entry points
        /// will be used such as when the application is launched to open a specific file.
        /// </summary>
        /// <param name="e">Details about the launch request and process.</param>
        protected override void OnLaunched(LaunchActivatedEventArgs e)
        {
            Frame rootFrame = Window.Current.Content as Frame;

            // Do not repeat app initialization when the Window already has content,
            // just ensure that the window is active
            if (rootFrame == null)
            {
                // Create a Frame to act as the navigation context and navigate to the first page
                rootFrame = new Frame();

                rootFrame.NavigationFailed += OnNavigationFailed;

                if (e.PreviousExecutionState == ApplicationExecutionState.Terminated)
                {
                    //TODO: Load state from previously suspended application
                }

                // Place the frame in the current Window
                Window.Current.Content = rootFrame;
            }

            if (e.PrelaunchActivated == false)
            {
                if (rootFrame.Content == null)
                {
                    // When the navigation stack isn't restored navigate to the first page,
                    // configuring the new page by passing required information as a navigation
                    // parameter
                    rootFrame.Navigate(typeof(MainPage), e.Arguments);
                }
                // Ensure the current window is active
                Window.Current.Activate();
            }
        }

        /// <summary>
        /// Invoked when Navigation to a certain page fails
        /// </summary>
        /// <param name="sender">The Frame which failed navigation</param>
        /// <param name="e">Details about the navigation failure</param>
        void OnNavigationFailed(object sender, NavigationFailedEventArgs e)
        {
            throw new Exception("Failed to load Page " + e.SourcePageType.FullName);
        }

        /// <summary>
        /// Invoked when application execution is being suspended.  Application state is saved
        /// without knowing whether the application will be terminated or resumed with the contents
        /// of memory still intact.
        /// </summary>
        /// <param name="sender">The source of the suspend request.</param>
        /// <param name="e">Details about the suspend request.</param>
        private void OnSuspending(object sender, SuspendingEventArgs e)
        {
            var deferral = e.SuspendingOperation.GetDeferral();
            //TODO: Save application state and stop any background activity
            deferral.Complete();
        }

        internal static void setPitch(float value)
        {
            pitch = value;
        }

        internal static void StartNoise()
        {
            frameInputNode.Start();
            audioGraph.Start();
        }

        internal static void StopNoise()
        {
            frameInputNode.Stop();
            audioGraph.Stop();    
        }
    }
}
```
And here is MainPage.xaml.cs:

```
namespace FG
{
    /// <summary>
    /// An empty page that can be used on its own or navigated to within a Frame.
    /// </summary>
    public sealed partial class MainPage : Page
    {

        public MainPage()
        {
            this.InitializeComponent();
            Window.Current.CoreWindow.KeyDown += CoreWindow_KeyDown;
            Window.Current.CoreWindow.KeyUp += CoreWindow_KeyUp;
        }

        void CoreWindow_KeyDown(Windows.UI.Core.CoreWindow sender, Windows.UI.Core.KeyEventArgs args)
        {
            if (args.VirtualKey == VirtualKey.A)
            {
                App.StartNoise();
            }
        }

        void CoreWindow_KeyUp(Windows.UI.Core.CoreWindow sender, Windows.UI.Core.KeyEventArgs args)
        {
            if (args.VirtualKey == VirtualKey.A)
            {
                App.StopNoise();
            }
        }

        private void PitchTextChanged(object sender, TextChangedEventArgs e)
        {
            try
            {
                App.setPitch(float.Parse(((TextBox)sender).Text));
            }
            catch (FormatException)
            {
                // Ignore partially typed or non-numeric input
            }
        }
    }
}
```

The only problem is that the frequency output is wrong...
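(One likely culprit, offered as an educated guess rather than a verified fix: in the second `GenerateAudioData`, the phase is recomputed as `i * sampleIncrement` starting from zero on every quantum, so each buffer restarts the wave at phase 0 instead of continuing from where the previous buffer ended. Unless a buffer happens to hold a whole number of cycles, that shifts the effective pitch and adds a click at every buffer boundary. A small Python sketch of carrying the phase across buffers, the way the field `audioWaveTheta` in the first version was presumably meant to work:)

```python
import math

def generate_buffer(samples, pitch, sample_rate, start_theta=0.0):
    """Synthesize `samples` sine samples; return the buffer and the final phase.

    `start_theta` carries the wave's phase across successive buffers, which is
    what needs to persist between QuantumStarted callbacks.
    """
    inc = 2 * math.pi * pitch / sample_rate
    buf = [0.3 * math.sin(start_theta + i * inc) for i in range(samples)]
    return buf, start_theta + samples * inc

# Two consecutive quanta at 997 Hz / 48 kHz (480 samples is not a whole
# number of 997 Hz cycles, so the boundary phase matters).
b1, theta = generate_buffer(480, pitch=997, sample_rate=48000)
b2, _ = generate_buffer(480, pitch=997, sample_rate=48000, start_theta=theta)

# With the phase carried over, b2 continues the wave exactly where b1 left
# off; resetting theta to 0 each quantum would jump back to sin(0) instead.
```

Two other things worth checking, also as guesses: `AudioEncodingProperties.BitsPerSample` is measured in bits, so `samples * BitsPerSample` allocates a buffer eight times larger than the `samples * sizeof(float)` byte count used in the first snippet; and the second `CreateFrameInputNode` no longer sets `ChannelCount = 1`, so if the graph defaults to stereo, mono samples written sequentially will be read as interleaved stereo and the pitch will come out wrong.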
