
objective-c – Encoding an H.264 compression session using CGDisplayStream

I am trying to create an H.264 compression session with the data from my screen. I have created a CGDisplayStreamRef instance as follows:

displayStream = CGDisplayStreamCreateWithDispatchQueue(CGMainDisplayID(), 100, 100, k32BGRAPixelFormat, nil, self.screenCaptureQueue, ^(CGDisplayStreamFrameStatus status, uint64_t displayTime, IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
    // Call encoding session here
});
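Inside that handler it is worth checking the frame status before doing any work, since the handler also fires for stopped or idle states. A minimal sketch (the check itself is my addition, not from the question):

```objc
// Sketch: only complete frames carry a new surface worth encoding.
if (status == kCGDisplayStreamFrameStatusFrameComplete && frameSurface != NULL) {
    // Hand frameSurface off to the compression session here.
}
```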

Here is how I currently have the encode function set up:

- (void)encode:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime presentationTimeStamp = CMTimeMake(frameID++, 1000);
    VTEncodeInfoFlags flags;
    OSStatus statusCode = VTCompressionSessionEncodeFrame(EncodingSession,
                                                          imageBuffer,
                                                          presentationTimeStamp,
                                                          kCMTimeInvalid,  // duration
                                                          NULL,            // frame properties
                                                          NULL,            // source frame refcon
                                                          &flags);
    if (statusCode != noErr) {
        NSLog(@"H264: VTCompressionSessionEncodeFrame failed with %d", (int)statusCode);

        VTCompressionSessionInvalidate(EncodingSession);
        CFRelease(EncodingSession);
        EncodingSession = NULL;
        return;
    }
    NSLog(@"H264: VTCompressionSessionEncodeFrame succeeded");
}
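For context, the `EncodingSession` that `encode:` uses has to be created beforehand. A minimal sketch of that setup, assuming a hypothetical output callback named `didCompressH264` and placeholder dimensions (neither is from the question):

```objc
// Sketch: creating the VTCompressionSessionRef that encode: assumes exists.
VTCompressionSessionRef EncodingSession = NULL;
VTCompressionSessionCreate(NULL,                    // allocator
                           1920, 1080,              // frame dimensions (placeholders)
                           kCMVideoCodecType_H264,  // codec
                           NULL, NULL, NULL,        // encoder spec, source attrs, data allocator
                           didCompressH264,         // output callback, invoked per encoded frame
                           NULL,                    // callback refcon
                           &EncodingSession);
VTSessionSetProperty(EncodingSession, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
VTCompressionSessionPrepareToEncodeFrames(EncodingSession);
```

The output callback is where the encoded NAL units arrive; its implementation is left out here since the question is about getting frames into the session, not out of it.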

I am trying to understand how I can convert the data from my screen into a CMSampleBufferRef so that I can call my encode function correctly. So far I have not been able to determine whether this is possible, or whether it is the right approach for what I am trying to do. Does anyone have any suggestions?

Edit: I have converted my IOSurface to a CMBlockBuffer, but have not yet figured out how to convert that into a CMSampleBufferRef:

void *mem = IOSurfaceGetBaseAddress(frameSurface);
size_t bytesPerRow = IOSurfaceGetBytesPerRow(frameSurface);
size_t height = IOSurfaceGetHeight(frameSurface);
size_t totalBytes = bytesPerRow * height;

CMBlockBufferRef blockBuffer;

// Wrap the surface memory without copying it; kCFAllocatorNull as the
// block allocator means the block buffer will not try to free `mem`.
CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault,
                                   mem, totalBytes,
                                   kCFAllocatorNull,
                                   NULL,            // custom block source
                                   0, totalBytes,   // offset, data length
                                   0,               // flags
                                   &blockBuffer);

Edit 2

Getting a bit further:

CMSampleBufferRef sampleBuffer = NULL;

OSStatus sampleStatus = CMSampleBufferCreate(NULL,         // allocator
                                             blockBuffer,  // data buffer
                                             TRUE,         // data ready
                                             NULL, NULL,   // make-data-ready callback + refcon
                                             NULL,         // format description
                                             1,            // number of samples
                                             0, NULL,      // timing entries
                                             0, NULL,      // sample size entries
                                             &sampleBuffer);

[self encode:sampleBuffer];

Solution

I may be a bit late, but it might still help others:

CGDisplayStreamCreateWithDispatchQueue(CGMainDisplayID(),
                                       width, height, k32BGRAPixelFormat, nil, queue,
                                       ^(CGDisplayStreamFrameStatus status, uint64_t displayTime,
                                         IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
    // The created pixel buffer retains the surface object.
    CVPixelBufferRef pixelBuffer;
    CVPixelBufferCreateWithIOSurface(NULL, frameSurface, NULL, &pixelBuffer);

    // Create the video-type-specific description for the pixel buffer.
    CMVideoFormatDescriptionRef videoFormatDescription;
    CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoFormatDescription);

    // All the necessary parts for creating a `CMSampleBuffer` are ready.
    CMSampleBufferRef sampleBuffer;
    CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;  // fill in real timestamps as needed
    CMSampleBufferCreateReadyWithImageBuffer(NULL, pixelBuffer, videoFormatDescription,
                                             &timingInfo, &sampleBuffer);

    // Do the stuff

    // Release the resources to let the frame surface be reused in the queue.
    // `kCGDisplayStreamQueueDepth` is responsible for the size of the queue.
    CFRelease(sampleBuffer);
    CFRelease(videoFormatDescription);
    CFRelease(pixelBuffer);
});
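One detail the answer glosses over is the timing info. `displayTime` is in Mach absolute-time units, so real timestamps can be derived from it rather than left invalid. A sketch of one way to do that (the conversion is my addition, not from the answer):

```objc
// Sketch: converting `displayTime` (Mach absolute-time ticks) into a
// CMSampleTimingInfo with a nanosecond-timescale presentation timestamp.
#include <mach/mach_time.h>

mach_timebase_info_data_t timebase;
mach_timebase_info(&timebase);  // ratio converting ticks to nanoseconds

uint64_t nanos = displayTime * timebase.numer / timebase.denom;
CMSampleTimingInfo timingInfo = {
    .duration              = kCMTimeInvalid,
    .presentationTimeStamp = CMTimeMake(nanos, NSEC_PER_SEC),
    .decodeTimeStamp       = kCMTimeInvalid,
};
```

Passing this `timingInfo` to `CMSampleBufferCreateReadyWithImageBuffer` gives the compression session monotonically increasing presentation timestamps tied to when each frame was actually captured.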
