
Metal Camera Capture and Rendering


I. Create the MTKView

    self.mtkView = [[MTKView alloc] initWithFrame:self.view.bounds];
    self.mtkView.device = MTLCreateSystemDefaultDevice();
    self.mtkView.delegate = self;
    self.mtkView.framebufferOnly = NO;
    [self.view insertSubview:self.mtkView atIndex:0];

framebufferOnly is set to NO here; the default is YES. If it is left at the default, then when drawInMTKView runs, the call

[filter encodeToCommandBuffer:commandBuffer sourceTexture:self.texture destinationTexture:view.currentDrawable.texture];

fails with the assertion

-[MTLDebugComputeCommandEncoder setTexture:atIndex:]:380: failed assertion `frameBufferOnly texture not supported for compute.'

This call encodes the sourceTexture into the destinationTexture through the filter's encodeToCommandBuffer: method. If framebufferOnly is not set to NO, view.currentDrawable.texture can only be used as a render-pass attachment (effectively read-only to anything else), so the compute kernel behind the filter cannot write into it.

 

When creating the MTKView, also add

CVMetalTextureCacheCreate(NULL, NULL, self.mtkView.device, NULL, &_textureCache);

This creates the CVMetalTextureCacheRef _textureCache, which is Core Video's Metal texture cache.
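For reference, here is a rough sketch of the headers and properties the snippets in this post rely on, including self.commandQueue, which the drawing code in section IV uses but the post never declares. The class name ViewController and the exact property attributes are my assumptions:

#import <MetalKit/MetalKit.h>
#import <AVFoundation/AVFoundation.h>
#import <MetalPerformanceShaders/MetalPerformanceShaders.h>
#import <CoreVideo/CVMetalTextureCache.h>

@interface ViewController () <MTKViewDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) MTKView *mtkView;
@property (nonatomic, strong) id<MTLCommandQueue> commandQueue;      // created once, e.g. [self.mtkView.device newCommandQueue]
@property (nonatomic, strong) id<MTLTexture> texture;                // latest camera frame as a Metal texture
@property (nonatomic, assign) CVMetalTextureCacheRef textureCache;   // Core Video's Metal texture cache
@property (nonatomic, strong) AVCaptureSession *mCaptureSession;
@property (nonatomic, strong) AVCaptureDeviceInput *mCaptureInput;
@property (nonatomic, strong) AVCaptureVideoDataOutput *mCaptureOutput;
@property (nonatomic, strong) dispatch_queue_t mProcessQueue;
@end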

 

II. Since the camera content is displayed in real time, a pipeline is needed that goes from capturing the camera to displaying it on screen

1. I will call this process the capture session, AVCaptureSession; so first create one

    self.mCaptureSession = [[AVCaptureSession alloc] init];
    self.mCaptureSession.sessionPreset = AVCaptureSessionPreset1920x1080;

Here the preset is set to 1920x1080, which specifies the resolution delivered to the receiver (the resolution the screen will display), chosen from within the range the camera supports; the larger the value, the sharper the picture.
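One defensive refinement (my addition, not in the original): assigning a preset the current device cannot deliver raises an exception, so the assignment can be guarded with canSetSessionPreset: and a fallback:

    if ([self.mCaptureSession canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
        self.mCaptureSession.sessionPreset = AVCaptureSessionPreset1920x1080;
    } else {
        // Fall back to a preset every device supports
        self.mCaptureSession.sessionPreset = AVCaptureSessionPresetHigh;
    }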

 

2. Set the camera as the capture session's input device

    // Input configuration
    // There are front and back cameras; the back camera is used here
    AVCaptureDevice* inputDevice = nil;
    NSArray* devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice* device in devices) {
        if ([device position] == AVCaptureDevicePositionBack) {
            inputDevice = device;
        }
    }
    self.mCaptureInput = [[AVCaptureDeviceInput alloc] initWithDevice:inputDevice error:NULL];
    if ([self.mCaptureSession canAddInput:self.mCaptureInput]) {
        [self.mCaptureSession addInput:self.mCaptureInput];
    }

As you can see, all the video capture devices are fetched first, then enumerated to find the back camera, which is used as the input camera.

Whether the input can be added is checked before adding it.
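Note that devicesWithMediaType: has been deprecated since iOS 10; an equivalent lookup (my substitution, assuming iOS 10+) uses AVCaptureDeviceDiscoverySession instead:

    AVCaptureDeviceDiscoverySession* discovery =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                               mediaType:AVMediaTypeVideo
                                                                position:AVCaptureDevicePositionBack];
    AVCaptureDevice* inputDevice = discovery.devices.firstObject;   // back wide-angle camera, or nil if unavailable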

 

3. Configure the format of the data output to the screen

    // Output configuration: the data delivered for display
    // Frames are delivered one at a time, so decide whether late frames should be dropped when display lags,
    // what pixel format to deliver,
    // and which delegate callback and serial dispatch queue to use
    self.mCaptureOutput = [[AVCaptureVideoDataOutput alloc] init];
    [self.mCaptureOutput setAlwaysDiscardsLateVideoFrames:NO];
    // BGRA is used here rather than a YUV color space, to avoid a conversion shader
    [self.mCaptureOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
    self.mProcessQueue = dispatch_queue_create("mProcessQueue", DISPATCH_QUEUE_SERIAL);
    [self.mCaptureOutput setSampleBufferDelegate:self queue:self.mProcessQueue];
    if ([self.mCaptureSession canAddOutput:self.mCaptureOutput]) {
        [self.mCaptureSession addOutput:self.mCaptureOutput];
    }

Since every frame must be displayed in the order it was captured, a serial dispatch queue is used. The class also sets itself as the sample buffer delegate and implements the protocol's corresponding captureOutput method.
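The same videoSettings line can also be written with modern Objective-C literal syntax (behavior unchanged):

    self.mCaptureOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

Also note that alwaysDiscardsLateVideoFrames:NO queues late frames rather than dropping them; for a pure live preview, YES is a common choice to keep latency from building up.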

 

4. Set the connection's display orientation

    // Set the connection's video orientation
    AVCaptureConnection* connection = [self.mCaptureOutput connectionWithMediaType:AVMediaTypeVideo];
    [connection setVideoOrientation:AVCaptureVideoOrientationPortrait];

The video's orientation relative to the home button is set through the output's connection.
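Not every connection supports setting the orientation, so a defensive variant (my addition) checks first; the commented mirroring line is only relevant if you switch to the front camera:

    if ([connection isVideoOrientationSupported]) {
        [connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    }
    // if ([connection isVideoMirroringSupported]) { connection.videoMirrored = YES; }   // typical for the front camera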

5. Finally, once the input and output are wired together, start running

[self.mCaptureSession startRunning];
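startRunning is a blocking call and Apple recommends not invoking it on the main thread; a hedged variant dispatches it to a background queue:

    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        [self.mCaptureSession startRunning];
    });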

 

III. captureOutput receives what the output device captured; that content needs to be processed into what drawInMTKView requires for display

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    
    CVMetalTextureRef tempTexture = NULL;
    // MTLPixelFormatBGRA8Unorm here matches the kCVPixelFormatType_32BGRA set on the output
    CVReturn status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, self.textureCache, pixelBuffer, NULL, MTLPixelFormatBGRA8Unorm, width, height, 0, &tempTexture);
    if (status == kCVReturnSuccess) {
        self.mtkView.drawableSize = CGSizeMake(width, height);
        self.texture = CVMetalTextureGetTexture(tempTexture);
        CFRelease(tempTexture);
    }
}
The camera delivers CMSampleBufferRef data; from it we get the CVPixelBufferRef, use CVMetalTextureCacheCreateTextureFromImage to create a Core Video Metal texture (CVMetalTextureRef), and finally obtain the Metal texture via CVMetalTextureGetTexture.
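One caveat worth hedging: the Metal texture returned by CVMetalTextureGetTexture is backed by the CVMetalTextureRef's pixel buffer, and releasing that reference immediately (as above) can in some cases let Core Video recycle the buffer before the GPU has finished reading it. A common defensive variant (my addition; the latestCVTexture property name is hypothetical) keeps the reference alive until the next frame replaces it:

    // assumes: @property (nonatomic, assign) CVMetalTextureRef latestCVTexture;
    if (status == kCVReturnSuccess) {
        self.mtkView.drawableSize = CGSizeMake(width, height);
        self.texture = CVMetalTextureGetTexture(tempTexture);
        if (self.latestCVTexture) {
            CFRelease(self.latestCVTexture);    // release the previous frame's reference
        }
        self.latestCVTexture = tempTexture;     // hold this one until the next frame arrives
    }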

IV. MTKView's per-frame display callback, drawInMTKView

- (void)drawInMTKView:(nonnull MTKView *)view {
    if (self.texture) {
        id<MTLCommandBuffer> commandBuffer = [self.commandQueue commandBuffer];
        MPSImageGaussianBlur* filter = [[MPSImageGaussianBlur alloc] initWithDevice:self.mtkView.device sigma:10];
        [filter encodeToCommandBuffer:commandBuffer sourceTexture:self.texture destinationTexture:view.currentDrawable.texture];
        [commandBuffer presentDrawable:view.currentDrawable];
        [commandBuffer commit];
        self.texture = nil;
    }
}
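MTKViewDelegate also declares a second required method, mtkView:drawableSizeWillChange:; a minimal empty stub is enough here, since drawableSize is already driven from captureOutput above:

- (void)mtkView:(nonnull MTKView *)view drawableSizeWillChange:(CGSize)size {
    // Nothing to do: drawableSize is set per frame in captureOutput.
}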

1. The header for MPSImageGaussianBlur is #import <MetalPerformanceShaders/MetalPerformanceShaders.h>

It is part of the Metal Performance Shaders framework used for this Metal filtering, so that header must be imported (see the availability check in the sketch after these notes).

2. Once created, the filter is executed via encodeToCommandBuffer:, documented as: "Encode a MPSKernel into a command Buffer. The operation shall proceed out-of-place."
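Two small refinements worth hedging (my additions, not from the original post; the blurFilter property name is assumed): Metal Performance Shaders is not supported on every GPU, so MPSSupportsMTLDevice can be checked once at setup, and both the command queue and the blur filter can be created once instead of on every frame:

    // At setup time, e.g. right after creating the MTKView
    if (MPSSupportsMTLDevice(self.mtkView.device)) {
        self.commandQueue = [self.mtkView.device newCommandQueue];
        self.blurFilter = [[MPSImageGaussianBlur alloc] initWithDevice:self.mtkView.device sigma:10];
    }
    // drawInMTKView: can then encode self.blurFilter instead of allocating a new filter each frame.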

 

Reference: https://www.jianshu.com/p/d3d698120891

Source: https://www.cnblogs.com/czwlinux/p/15743277.html