Problem with Xamarin Camera AV Capture Sample (memory leak)

TommyOvesen (US Member)
edited May 2015 in Xamarin.iOS

This sample: https://developer.xamarin.com/samples/monotouch/AVCaptureFrames/

"An example on using AVFoundation and AVCaptureSession to display a real-time capture from a raw stream using a video camera."
It seems to leak memory big time.

I tried to incorporate the same feature into my app, and the application output shows a memory warning after a few seconds; then the app crashes silently.
I tried the sample itself on a real device (iPhone 6 Plus, iOS 8.3) and it also shows memory warnings. I also noticed in the application output that several new threads are spawned regularly.

Here is the log from the sample:

Loaded assembly: /private/var/mobile/Containers/Bundle/Application/743F0F0D-F449-4BE0-A588-A6675193AFD2/avcaptureframes.app/.monotouch-32/avcaptureframes.exe
Thread started: #3
Thread started: #4
Thread started: #5
Thread started: #6
2015-05-04 19:54:36.060 avcaptureframes[8326:1512472] Received memory warning.
Thread started: #7
Thread started: #8
Thread started: #9
2015-05-04 19:54:59.088 avcaptureframes[8326:1512472] Received memory warning.
Thread started: #10
Thread started: #11
2015-05-04 19:55:22.600 avcaptureframes[8326:1512472] Received memory warning.
Thread started: #3
Thread started: #4
2015-05-04 19:55:46.842 avcaptureframes[8326:1512472] Received memory warning.
Thread started: #5
Thread started: #7
Thread started: #12

Why are all these threads started?

I cannot figure out what is wrong with this code or how to fix it, and I cannot find another working sample.

Any ideas on how to make a camera app with Xamarin that works for more than 5 seconds would be appreciated :-)

Here is the code that is problematic:

//
// How to capture video frames from the camera as images using AVFoundation sample
//
// Based on Apple's technical Q&A QA1702 sample
//
//
using System;
using System.Collections.Generic;
using System.Linq;
using CoreGraphics;
using Foundation;
using UIKit;
using AVFoundation;
using CoreVideo;
using CoreMedia;

using CoreFoundation;
using System.Runtime.InteropServices;

namespace avcaptureframes
{
    public class Application
    {
        static void Main (string[] args)
        {
            UIApplication.Main (args);
        }
    }

    public partial class AppDelegate : UIApplicationDelegate
    {
        public static UIImageView ImageView;
        UIViewController vc;
        AVCaptureSession session;
        OutputRecorder outputRecorder;
        DispatchQueue queue;

        public override bool FinishedLaunching (UIApplication app, NSDictionary options)
        {
            ImageView = new UIImageView (new CGRect (20, 20, 280, 280));
            ImageView.ContentMode = UIViewContentMode.ScaleAspectFill;

            vc = new UIViewController ();
            vc.View = ImageView;
            window.RootViewController = vc;

            window.MakeKeyAndVisible ();
            window.BackgroundColor = UIColor.Black;

            if (!SetupCaptureSession ())
                window.AddSubview (new UILabel (new CGRect (20, 20, 200, 60)) { Text = "No input device" });

            return true;
        }

        bool SetupCaptureSession ()
        {
            // configure the capture session for low resolution, change this if your code
            // can cope with more data or volume
            session = new AVCaptureSession () {
                SessionPreset = AVCaptureSession.PresetMedium
            };

            // create a device input and attach it to the session
            var captureDevice = AVCaptureDevice.DefaultDeviceWithMediaType (AVMediaType.Video);
            if (captureDevice == null){
                Console.WriteLine ("No captureDevice - this won't work on the simulator, try a physical device");
                return false;
            }
            // Configure for 15 FPS. Note the use of LockForConfiguration()/UnlockForConfiguration()
            NSError error = null;
            captureDevice.LockForConfiguration(out error);
            if(error != null)
            {
                Console.WriteLine(error);
                captureDevice.UnlockForConfiguration();
                return false;
            }
            if(UIDevice.CurrentDevice.CheckSystemVersion(7,0))
                captureDevice.ActiveVideoMinFrameDuration = new CMTime (1,15);
            captureDevice.UnlockForConfiguration();

            var input = AVCaptureDeviceInput.FromDevice (captureDevice);
            if (input == null){
                Console.WriteLine ("No input - this won't work on the simulator, try a physical device");
                return false;
            }
            session.AddInput (input);

            // create a VideoDataOutput and add it to the session
            var output = new AVCaptureVideoDataOutput () {
                WeakVideoSettings = new CVPixelBufferAttributes () {
                                    PixelFormatType = CVPixelFormatType.CV32BGRA
                                     }.Dictionary,
            };

            // configure the output
            queue = new CoreFoundation.DispatchQueue ("myQueue");
            outputRecorder = new OutputRecorder ();
            output.SetSampleBufferDelegate (outputRecorder, queue);
            session.AddOutput (output);

            session.StartRunning ();
            return true;
        }

        public override void OnActivated (UIApplication application)
        {
        }

        public class OutputRecorder : AVCaptureVideoDataOutputSampleBufferDelegate {
            public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
            {
                try {
                    var image = ImageFromSampleBuffer (sampleBuffer);

                    // Do something with the image, we just stuff it in our main view.
                    AppDelegate.ImageView.BeginInvokeOnMainThread (delegate {
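                        // (Note: this replaces the previous UIImage on ImageView
                        // without disposing it; see my sketch after the code.)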
                        AppDelegate.ImageView.Image = image;
                        AppDelegate.ImageView.Transform = CGAffineTransform.MakeRotation((float)Math.PI/2);

                    });

                    //
                    // Although this looks innocent "Oh, he is just optimizing this case away"
                    // this is incredibly important to call on this callback, because the AVFoundation
                    // has a fixed number of buffers and if it runs out of free buffers, it will stop
                    // delivering frames.
                    //
                    sampleBuffer.Dispose ();
                } catch (Exception e){
                    Console.WriteLine (e);
                }
            }

            UIImage ImageFromSampleBuffer (CMSampleBuffer sampleBuffer)
            {
                // Get the CoreVideo image
                using (var pixelBuffer = sampleBuffer.GetImageBuffer () as CVPixelBuffer){
                    // Lock the base address
                    pixelBuffer.Lock (0);
                    // Get the number of bytes per row for the pixel buffer
                    var baseAddress = pixelBuffer.BaseAddress;
                    int bytesPerRow = (int) pixelBuffer.BytesPerRow;
                    int width = (int) pixelBuffer.Width;
                    int height = (int) pixelBuffer.Height;
                    var flags = CGBitmapFlags.PremultipliedFirst | CGBitmapFlags.ByteOrder32Little;
                    // Create a CGImage on the RGB colorspace from the configured parameter above
                    using (var cs = CGColorSpace.CreateDeviceRGB ())
                    using (var context = new CGBitmapContext (baseAddress,width, height, 8, bytesPerRow, cs, (CGImageAlphaInfo) flags))
                    using (var cgImage = context.ToImage ()){
                        pixelBuffer.Unlock (0);
                        return UIImage.FromImage (cgImage);
                    }
                }
            }
        }
    }
}
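
My best guess (not verified) is that every frame allocates a new UIImage, the old image on ImageView is simply replaced without being disposed, and the callback runs on a background dispatch queue with no autorelease pool, so autoreleased Objective-C objects pile up as well. Here is a sketch of the kind of mitigation I have in mind for DidOutputSampleBuffer, assuming that is the cause:

public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    // Drain Objective-C autoreleased objects created on this background
    // queue; without a pool they accumulate for the lifetime of the thread.
    using (var pool = new NSAutoreleasePool ()) {
        try {
            var image = ImageFromSampleBuffer (sampleBuffer);

            AppDelegate.ImageView.BeginInvokeOnMainThread (delegate {
                // Dispose the previous frame's UIImage so its unmanaged
                // bitmap is freed immediately instead of waiting for the
                // garbage collector to notice the memory pressure.
                var previous = AppDelegate.ImageView.Image;
                AppDelegate.ImageView.Image = image;
                if (previous != null)
                    previous.Dispose ();
            });
        } catch (Exception e) {
            Console.WriteLine (e);
        } finally {
            // Release the sample buffer so AVFoundation can reuse it;
            // its buffer pool is fixed-size and frames stop arriving
            // if it runs dry.
            sampleBuffer.Dispose ();
        }
    }
}

Setting AlwaysDiscardsLateVideoFrames = true on the AVCaptureVideoDataOutput might also keep the buffer pool from backing up, though I have not tried that either.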
