Camera Preview and Sample Buffer

Episode #296 | 10 minutes | published on August 4, 2017 | Uses Xcode-9.0-beta3, swift-4
We set up the preview layer so we can see the video, then we add a sample buffer queue so we can get access to the individual frames of the video coming through the capture session.

Previewing the Camera Input

We can use AVCaptureVideoPreviewLayer to see what the camera input is capturing. We'll do this in the setupCaptureSession() method.

            let cameraInput = try AVCaptureDeviceInput(device: camera)
            captureSession.addInput(cameraInput)

            // after adding the input...            
            let preview = AVCaptureVideoPreviewLayer(session: captureSession)
            preview.frame = view.bounds
            preview.backgroundColor = UIColor.black.cgColor
            preview.videoGravity = .resizeAspect
            view.layer.addSublayer(preview)
            self.previewLayer = preview

Now we can see live video output!
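One thing to watch for: a CALayer doesn't participate in Auto Layout, so the preview layer's frame won't track the view through rotation or size changes. A minimal way to handle this (assuming the previewLayer property we stored above) is to update the frame whenever the view lays out:

```swift
// CALayers don't autoresize, so keep the preview layer
// sized to the view manually on every layout pass.
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    previewLayer?.frame = view.bounds
}
```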

Capturing the Samples

To capture samples, we'll have to add an output to our capture session. One such output type is AVCaptureVideoDataOutput, which delivers each video frame to a delegate.

            let output = AVCaptureVideoDataOutput()
            output.alwaysDiscardsLateVideoFrames = true
            output.setSampleBufferDelegate(self, queue: sampleBufferQueue)

We'll need to define that sampleBufferQueue at the top of the class. The documentation for setSampleBufferDelegate(_:queue:) calls for a serial queue so that frames are delivered in order, so we create one instead of using a concurrent global queue:

    let sampleBufferQueue = DispatchQueue(label: "sampleBufferQueue", qos: .userInteractive)

We also need to conform to this protocol:

extension CaptureViewController : AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        print("sampleBuffer")
    }
}
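Printing proves that frames are arriving, but the interesting data is the frame itself. Inside captureOutput(_:didOutput:from:) we can grab the underlying pixel buffer; this sketch (an assumption for illustration, the episode just prints a string) logs each frame's dimensions:

```swift
// Inside captureOutput(_:didOutput:from:) — each CMSampleBuffer
// wraps a CVPixelBuffer holding the raw frame data.
guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
let width = CVPixelBufferGetWidth(pixelBuffer)
let height = CVPixelBufferGetHeight(pixelBuffer)
print("frame: \(width)x\(height)")
```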

Finally, we can add this output to the capture session.

            captureSession.addOutput(output)

Running this, we see "sampleBuffer" printed repeatedly in the console, letting us know it's working!
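Putting the pieces together, the full setupCaptureSession() might look roughly like this. The camera-discovery line and the startRunning() call are assumptions; the episode set up the camera device previously:

```swift
func setupCaptureSession() {
    // Assumed from earlier setup: grab the default video camera.
    guard let camera = AVCaptureDevice.default(for: .video) else { return }

    do {
        let cameraInput = try AVCaptureDeviceInput(device: camera)
        captureSession.addInput(cameraInput)
    } catch {
        print("Unable to create camera input: \(error)")
        return
    }

    // Preview layer so we can see the live video.
    let preview = AVCaptureVideoPreviewLayer(session: captureSession)
    preview.frame = view.bounds
    preview.backgroundColor = UIColor.black.cgColor
    preview.videoGravity = .resizeAspect
    view.layer.addSublayer(preview)
    previewLayer = preview

    // Sample buffer output so we can inspect individual frames.
    let output = AVCaptureVideoDataOutput()
    output.alwaysDiscardsLateVideoFrames = true
    output.setSampleBufferDelegate(self, queue: sampleBufferQueue)
    captureSession.addOutput(output)

    captureSession.startRunning()
}
```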
