OpenGL ES iOS Notes 04: Rendering Video with AVSampleBufferDisplayLayer

Previous note

Rendering video with OpenGL requires familiarity with a range of OpenGL state and function calls, which doesn't fit Objective-C conventions very well. If all you need is simple rendering and display, you can instead use AVSampleBufferDisplayLayer, which Apple introduced in iOS 8.

As before, we display the frames captured by the iOS device's camera.

Creating the AVSampleBufferDisplayLayer

As the name suggests, AVSampleBufferDisplayLayer is a CALayer subclass, so it can be added to a UIView's layer. Here we create a new view that uses an AVSampleBufferDisplayLayer to render video.

#import <AVFoundation/AVFoundation.h>

@interface LYRSampleBufferDisplayView ()
@property (nonatomic, strong) AVSampleBufferDisplayLayer *displayLayer;
@end

@implementation LYRSampleBufferDisplayView

- (instancetype)initWithFrame:(CGRect)frame
{
    if (self = [super initWithFrame:frame]) {
        [self createLayer];
    }
    return self;
}

- (instancetype)initWithCoder:(NSCoder *)aDecoder
{
    if (self = [super initWithCoder:aDecoder]) {
        [self createLayer];
    }
    return self;
}

- (void)layoutSubviews
{
    [super layoutSubviews];
    // Changing a layer's frame triggers an implicit animation, so disable it manually
    [CATransaction begin];
    [CATransaction setDisableActions:YES];
    self.displayLayer.frame = self.bounds;
    [CATransaction commit];
}

- (void)createLayer
{
    self.displayLayer = [AVSampleBufferDisplayLayer layer];
    self.displayLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.layer addSublayer:self.displayLayer];
}

// Forward sample buffers to the display layer so callers don't need the private property
- (void)enqueueSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    [self.displayLayer enqueueSampleBuffer:sampleBuffer];
}

@end
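
For completeness, a minimal usage sketch of the view (the renderView property and the hosting view controller are illustrative assumptions, not part of the original demo):

// In the view controller that owns the capture session; renderView is a hypothetical property
self.renderView = [[LYRSampleBufferDisplayView alloc] initWithFrame:self.view.bounds];
self.renderView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
[self.view addSubview:self.renderView];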

Rendering and Displaying a CMSampleBuffer

AVSampleBufferDisplayLayer exposes the method:

- (void)enqueueSampleBuffer:(CMSampleBufferRef)sampleBuffer;

Data obtained from videoDataOutput can be passed in directly and displayed:

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    [self.renderView enqueueSampleBuffer:sampleBuffer];
}

Run the program and you can see the rendered picture.

Rendering Raw Buffer Data

Of course, in real applications you rarely have a ready-made CMSampleBuffer to display. For example, frames decoded with FFmpeg are usually stored in an AVFrame struct, behind its data pointers. The raw buffer therefore has to be wrapped into a CMSampleBuffer before AVSampleBufferDisplayLayer can display it. We start with the RGBA data format.

Obtaining a CVPixelBuffer from a CVPixelBufferPool

Creating a CMSampleBuffer requires a CVPixelBuffer. We use a CVPixelBufferPool here, so the pool decides when CVPixelBuffers are created and destroyed and we don't have to worry about the memory issues they can cause.

@interface LYRSampleBufferDisplayView ()
{
    CVPixelBufferPoolRef _pixelBufferPool;
}

Here we define a method whose parameters are the RGBA buffer pointer and the video's width and height:

- (void)displayWithRGBBuffer:(uint8_t *)rgbBuffer width:(int)width height:(int)height
{
    CVReturn theError;
    if (_pixelBufferPool) {
        CVPixelBufferPoolFlush(_pixelBufferPool, kCVPixelBufferPoolFlushExcessBuffers);
        CVPixelBufferPoolRelease(_pixelBufferPool);
        _pixelBufferPool = NULL;
    }
    
    if (!_pixelBufferPool) {
        NSMutableDictionary *attributes = [NSMutableDictionary dictionary];
        // kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange would be NV12
        [attributes setObject:@(kCVPixelFormatType_32BGRA) forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
        [attributes setObject:@(width) forKey:(NSString *)kCVPixelBufferWidthKey];
        [attributes setObject:@(height) forKey:(NSString *)kCVPixelBufferHeightKey];
        [attributes setObject:@(16) forKey:(NSString *)kCVPixelBufferBytesPerRowAlignmentKey];
        [attributes setObject:[NSDictionary dictionary] forKey:(NSString *)kCVPixelBufferIOSurfacePropertiesKey];
        theError = CVPixelBufferPoolCreate(kCFAllocatorDefault, NULL, (__bridge CFDictionaryRef)attributes, &_pixelBufferPool);
        if (theError != kCVReturnSuccess) {
            NSLog(@"CVPixelBufferPoolCreate Failed");
            _pixelBufferPool = NULL;
            return;
        }
    }
}

This creates a CVPixelBufferPool based on the video's width, height, and pixel format. Note the kCVPixelBufferBytesPerRowAlignmentKey attribute: it sets the alignment base for the row stride (linesize/bytesPerRow), not the actual stride itself. The actual stride is determined by the width, height, and pixel format; for 32BGRA the linesize is normally width * 4.
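
Because of that alignment, the actual stride reported by CVPixelBufferGetBytesPerRow can be larger than width * 4. A stride-aware copy is safer than a single memcpy; a minimal sketch (the helper name is ours, and it assumes the source buffer is tightly packed and the base address is already locked):

// Copy row by row so row padding added by the pool does not shear the image.
// Call with the pixel buffer's base address locked; assumes a tightly packed source.
static void LYRCopyRGBARows(CVPixelBufferRef pixelBuffer, const uint8_t *rgbBuffer, int width, int height)
{
    uint8_t *dst = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t dstBytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t srcBytesPerRow = (size_t)width * 4;
    for (int row = 0; row < height; row++) {
        memcpy(dst + row * dstBytesPerRow, rgbBuffer + row * srcBytesPerRow, srcBytesPerRow);
    }
}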

Creating the CVPixelBuffer and copying in the video data

- (void)displayWithRGBBuffer:(uint8_t *)rgbBuffer width:(int)width height:(int)height
{
    ... // CVPixelBufferPool creation as above
    CVPixelBufferRef pixelBuffer = NULL;
    theError = CVPixelBufferPoolCreatePixelBuffer(NULL, _pixelBufferPool, &pixelBuffer);
    if (theError != kCVReturnSuccess) {
        NSLog(@"CVPixelBufferPoolCreatePixelBuffer Failed");
        pixelBuffer = NULL;
        return;
    }
    
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    
    void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    if (base == NULL) {
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
        CVPixelBufferRelease(pixelBuffer);
        return;
    }
    
    // Copy the data into base (assumes bytesPerRow == width * 4; see the stride-aware copy above)
    memcpy(base, rgbBuffer, width * height * 4);
    
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    
    [self displayPixelBuffer:pixelBuffer];
}

Creating a CMSampleBuffer from the CVPixelBuffer and displaying it

- (void)displayPixelBuffer:(CVPixelBufferRef)pixelBuffer
{
    if (!pixelBuffer) {
        return;
    }
    
    // No concrete timing information is set
    CMSampleTimingInfo timing = {kCMTimeInvalid, kCMTimeInvalid, kCMTimeInvalid};
    // Build the video format description
    CMVideoFormatDescriptionRef videoInfo = NULL;
    OSStatus result = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoInfo);
    NSParameterAssert(result == 0 && videoInfo != NULL);
    
    CMSampleBufferRef sampleBuffer = NULL;
    result = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL, videoInfo, &timing, &sampleBuffer);
    NSParameterAssert(result == 0 && sampleBuffer != NULL);
    CFRelease(pixelBuffer);
    CFRelease(videoInfo);
    
    // Since there is no timing info, mark the frame to be displayed immediately
    CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);
    CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
    CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
    
    // Handle the layer failing after the app returns from the background
    if (self.displayLayer.status == AVQueuedSampleBufferRenderingStatusFailed) {
        [self.displayLayer flush];
    }
    
    [self.displayLayer enqueueSampleBuffer:sampleBuffer];
    CFRelease(sampleBuffer);
}
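
Every frame above is marked kCMSampleAttachmentKey_DisplayImmediately because the timing fields are all kCMTimeInvalid. If your samples carry valid presentation timestamps (e.g. from a decoder), an alternative is to let the layer schedule them via its controlTimebase; a minimal sketch, assuming playback driven by the host clock:

// Attach a control timebase so enqueued buffers with valid PTS are shown on schedule
CMTimebaseRef timebase = NULL;
CMTimebaseCreateWithMasterClock(kCFAllocatorDefault, CMClockGetHostTimeClock(), &timebase);
CMTimebaseSetTime(timebase, kCMTimeZero);  // position at the start of the stream
CMTimebaseSetRate(timebase, 1.0);          // play at normal speed
self.displayLayer.controlTimebase = timebase;
CFRelease(timebase);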

That completes the whole pipeline. Next, modify the capture callback so the RGBA data is handed to the rendering view.

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // videoDataOutput is assumed to be configured for kCVPixelFormatType_32BGRA
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    
    if (CVPixelBufferLockBaseAddress(imageBuffer, 0) == kCVReturnSuccess)
    {
        UInt8 *rgbBuffer = (UInt8 *)CVPixelBufferGetBaseAddress(imageBuffer);
        
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        
        [self.renderView displayWithRGBBuffer:rgbBuffer width:(int)width height:(int)height];
        
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    }
}

Run the program again and you can see the captured video.

Rendering NV12 Data

The steps mirror the RGBA case: set the pixel format to NV12 when creating the CVPixelBufferPool, and copy both planes when filling the buffer.

- (void)displayWithNV12yBuffer:(uint8_t *)yBuffer uvBuffer:(uint8_t *)uvBuffer width:(int)width height:(int)height
{
    CVReturn theError;
    if (_pixelBufferPool) {
        CVPixelBufferPoolFlush(_pixelBufferPool, kCVPixelBufferPoolFlushExcessBuffers);
        CVPixelBufferPoolRelease(_pixelBufferPool);
        _pixelBufferPool = NULL;
    }
    
    if (!_pixelBufferPool) {
        NSMutableDictionary *attributes = [NSMutableDictionary dictionary];
        [attributes setObject:@(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
        [attributes setObject:@(width) forKey:(NSString *)kCVPixelBufferWidthKey];
        [attributes setObject:@(height) forKey:(NSString *)kCVPixelBufferHeightKey];
        [attributes setObject:@(16) forKey:(NSString *)kCVPixelBufferBytesPerRowAlignmentKey];
        [attributes setObject:[NSDictionary dictionary] forKey:(NSString *)kCVPixelBufferIOSurfacePropertiesKey];
        theError = CVPixelBufferPoolCreate(kCFAllocatorDefault, NULL, (__bridge CFDictionaryRef)attributes, &_pixelBufferPool);
        if (theError != kCVReturnSuccess) {
            NSLog(@"CVPixelBufferPoolCreate Failed");
            _pixelBufferPool = NULL;
            return;
        }
    }
    
    CVPixelBufferRef pixelBuffer = NULL;
    theError = CVPixelBufferPoolCreatePixelBuffer(NULL, _pixelBufferPool, &pixelBuffer);
    if (theError != kCVReturnSuccess) {
        NSLog(@"CVPixelBufferPoolCreatePixelBuffer Failed");
        pixelBuffer = NULL;
        return;
    }
    
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    // Get the plane base addresses by index: plane 0 is Y, plane 1 is interleaved UV
    void *y_base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    void *uv_base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
    if (y_base == NULL || uv_base == NULL) {
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
        CVPixelBufferRelease(pixelBuffer);
        return;
    }
    
    // The Y plane holds width * height bytes (assumes no row padding; see the stride note above)
    memcpy(y_base, yBuffer, width * height);
    // The UV plane holds half as much: width * height / 2 bytes
    memcpy(uv_base, uvBuffer, width * height / 2);
    
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    
    [self displayPixelBuffer:pixelBuffer];
}

To call it, first switch the capture format to NV12 (see the sketch below), then pull the two plane buffers out of the sample buffer in turn.
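
A minimal sketch of switching the output format (videoDataOutput here stands for the AVCaptureVideoDataOutput from the capture setup):

// Ask the capture output to deliver NV12 (bi-planar 4:2:0) pixel buffers
videoDataOutput.videoSettings = @{
    (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
};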

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    
    if (CVPixelBufferLockBaseAddress(imageBuffer, 0) == kCVReturnSuccess)
    {
        // Image width in pixels
        size_t pixelWidth = CVPixelBufferGetWidth(imageBuffer);
        // Image height in pixels
        size_t pixelHeight = CVPixelBufferGetHeight(imageBuffer);
        // Y plane of the CVImageBufferRef
        uint8_t *y_frame = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
        // UV plane of the CVImageBufferRef
        uint8_t *uv_frame = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 1);
        
        [self.renderView displayWithNV12yBuffer:y_frame uvBuffer:uv_frame width:(int)pixelWidth height:(int)pixelHeight];
        
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    }
}

Rendering yuv420p

Rendering yuv420p works much the same way: set the CVPixelBufferPool format to kCVPixelFormatType_420YpCbCr8Planar and copy the three planes into the CVPixelBuffer in turn, so the code is omitted here. One caveat: in my tests, displaying yuv420p directly does not work at all on iOS 10, and on iOS 11 versions below 11.3 the picture flickers. When supporting older systems you therefore need to convert to NV12 before display; the conversion can be done with libyuv, mentioned in the previous note, as sketched below.
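
A minimal sketch of that conversion with libyuv (the wrapper name is ours; the destination buffers are assumed to be allocated by the caller, and all planes are assumed tightly packed):

#include "libyuv.h"

// Convert tightly packed I420 (yuv420p) planes into NV12 buffers for display
void LYRI420ToNV12(const uint8_t *y, const uint8_t *u, const uint8_t *v,
                   uint8_t *nv12Y, uint8_t *nv12UV, int width, int height)
{
    I420ToNV12(y, width,        // source Y plane, stride = width
               u, width / 2,    // source U plane, stride = width / 2
               v, width / 2,    // source V plane, stride = width / 2
               nv12Y, width,    // destination Y plane
               nv12UV, width,   // destination interleaved UV plane
               width, height);
}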

Summary

Although AVSampleBufferDisplayLayer is fairly simple to use, every frame incurs a memcpy, which costs performance, and direct YUV rendering is limited by OS version as described above, so I personally still lean toward the OpenGL rendering approach.

Demo link

Reference:
Rendering video with AVSampleBufferDisplayLayer on iOS (在iOS端使用AVSampleBufferDisplayLayer进行视频渲染)