Forum Xamarin.iOS

How to access microphone stream in Mac?

xoail Member ✭✭

Hi, I am trying to port my WinForms app to Mac and have a question regarding microphone access.

In my WinForms app, I use NAudio to capture the microphone stream and run it through some speech recognition in real time.

But unfortunately I am unable to install the NAudio NuGet package in my Xamarin.Mac app. I believe it is incompatible.

So my question is, how can I go about accessing the microphone stream on a Mac? What APIs exist? My research mostly turned up iOS- or Android-related APIs, but not much for Mac.

Any pointers will be greatly appreciated.


  • seanyda GB Member ✭✭✭✭✭

    You will want to use these classes from the AVFoundation namespace:

    AVAudioPlayer player;
    AVAudioRecorder recorder;
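
    A minimal AVAudioRecorder setup on Xamarin.Mac might look like the sketch below (a sketch only: it assumes the AVAudioRecorder.Create factory in the Xamarin binding, and the file path and settings are placeholders; note that AVAudioRecorder records to a file rather than exposing a live stream):

    NSError error;
    // Hypothetical output path; AVAudioRecorder writes to a file.
    var url = NSUrl.FromFilename(System.IO.Path.Combine(System.IO.Path.GetTempPath(), "capture.wav"));
    var settings = new AudioSettings
    {
        Format = AudioToolbox.AudioFormatType.LinearPCM,
        SampleRate = 16000,
        NumberChannels = 1
    };
    recorder = AVAudioRecorder.Create(url, settings, out error);
    if (recorder != null && recorder.PrepareToRecord())
        recorder.Record();   // stop later with recorder.Stop()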
  • xoail Member ✭✭

    Thanks for the reply. Is there an example I can be referred to? I looked up AVAudioRecorder; it looks like it uses AVAudioSession (which doesn't exist on macOS?) and also saves to an audio file.
    I am looking to capture the stream and send it to Google Speech API (gRPC).

  • xoail Member ✭✭

    Can anyone advise whether I am on the right track? Being new to the Mac world, I am trying to understand how I can pass the microphone stream to Google.
    Here's what I have so far. When I run this, the delegate does not get called at all:

    using System;
    using AppKit;
    using Foundation;
    using AVFoundation;
    using CoreMedia;

    namespace SpeechCommands
    {
        public partial class ViewController : NSViewController, IAVCaptureAudioDataOutputSampleBufferDelegate
        {
            AVCaptureSession session;

            public ViewController(IntPtr handle) : base(handle)
            {
            }

            public void DidOutputSampleBuffer(AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
            {
                Console.WriteLine("DidOutputSampleBuffer is called");
            }

            public override void ViewDidLoad()
            {
                base.ViewDidLoad();
                session = new AVCaptureSession();
                var mic = AVCaptureDevice.GetDefaultDevice(AVMediaType.Audio);
                if (mic == null)
                    throw new Exception("Can't find devices");
                var micInput = AVCaptureDeviceInput.FromDevice(mic);
                if (session.CanAddInput(micInput))
                    session.AddInput(micInput);
                var output = new AVCaptureAudioDataOutput
                {
                    AudioSettings = new AudioSettings
                    {
                        Format = AudioToolbox.AudioFormatType.Flac,
                        SampleRate = 16000,
                        NumberChannels = 1,
                        AudioQuality = AVAudioQuality.High
                    }
                };
                var queue = new CoreFoundation.DispatchQueue("myQueue");
                output.SetSampleBufferDelegateQueue(this, queue);
            }

            public override NSObject RepresentedObject
            {
                get { return base.RepresentedObject; }
                set { base.RepresentedObject = value; }
            }
        }
    }
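
    For reference, one way to pull the raw bytes out of the CMSampleBuffer that the delegate receives is via its CMBlockBuffer. A sketch (it assumes the Xamarin CMBlockBuffer.CopyDataBytes overload taking an IntPtr destination, and leaves the actual Google Speech gRPC call out):

    using System.Runtime.InteropServices;

    public void DidOutputSampleBuffer(AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
    {
        using (var blockBuffer = sampleBuffer.GetDataBuffer())
        {
            // Copy the audio bytes into a managed array.
            var data = new byte[(int)blockBuffer.DataLength];
            var handle = GCHandle.Alloc(data, GCHandleType.Pinned);
            try
            {
                blockBuffer.CopyDataBytes(0, (uint)data.Length, handle.AddrOfPinnedObject());
            }
            finally
            {
                handle.Free();
            }
            // 'data' now holds the captured samples; this is what would be
            // written to the Google Speech streaming request.
        }
        sampleBuffer.Dispose();
    }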
  • ChrisHamons US Forum Administrator, Xamarin Team Xamurai

    Consider reading some of the documentation listed here:

    in particular:

    In Cocoa, to be called back you have to assign the delegate property of the item in question to the instance you want the callbacks invoked on, something like foo.Delegate = this; if your current class implements the relevant delegate.

    This is a very common pattern that you will need to become familiar with.
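
    Applied to the code above: besides setting the delegate queue, the output also has to be attached to the session and the session started, or the delegate will never fire. A sketch of the missing lines at the end of ViewDidLoad (the queue name is arbitrary):

    output.SetSampleBufferDelegateQueue(this, new CoreFoundation.DispatchQueue("audioQueue"));
    if (session.CanAddOutput(output))
        session.AddOutput(output);   // without this the output receives no sample buffers
    session.StartRunning();          // capture only begins once the session is running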
