How to use vImageScale with kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
I receive a CMSampleBuffer from the iPhone's front camera. It is currently 1920x1080, and I want to scale it down to 1280x720. I want to use the vImageScale function, but I can't get it to work correctly. The pixel format from the camera is kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, so I tried the following, but it outputs a strange, incorrect green image:
private var scaleBuffer: vImage_Buffer = {
    var scaleBuffer = vImage_Buffer()
    let newWidth = 1280
    let newHeight = 720
    scaleBuffer.data = UnsafeMutableRawPointer.allocate(byteCount: newWidth * newHeight * 4, alignment: MemoryLayout<UInt>.size)
    scaleBuffer.width = vImagePixelCount(newWidth)
    scaleBuffer.height = vImagePixelCount(newHeight)
    scaleBuffer.rowBytes = newWidth * 4
    return scaleBuffer
}()
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return
    }
    CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

    // create vImage_Buffer out of CVImageBuffer
    var inBuff = vImage_Buffer()
    inBuff.width = UInt(CVPixelBufferGetWidth(imageBuffer))
    inBuff.height = UInt(CVPixelBufferGetHeight(imageBuffer))
    inBuff.rowBytes = CVPixelBufferGetBytesPerRow(imageBuffer)
    inBuff.data = CVPixelBufferGetBaseAddress(imageBuffer)

    // perform scale
    let err = vImageScale_CbCr8(&inBuff, &scaleBuffer, nil, 0)
    if err != kvImageNoError {
        print("Can't scale a buffer")
        return
    }
    CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

    var newBuffer: CVPixelBuffer?
    let attributes: [NSObject: AnyObject] = [
        kCVPixelBufferCGImageCompatibilityKey: true as AnyObject,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true as AnyObject
    ]
    let status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                              Int(scaleBuffer.width),
                                              Int(scaleBuffer.height),
                                              kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                              scaleBuffer.data,
                                              Int(scaleBuffer.width) * 4,
                                              nil, nil,
                                              attributes as CFDictionary?,
                                              &newBuffer)
    guard status == kCVReturnSuccess, let b = newBuffer else {
        return
    }
    // Do something with the buffer to output it
}
What is going wrong here? Looking at this answer here, it seems I need to scale the "Y" and "UV" planes separately. How can I do that in Swift, and then combine them back into a single CVPixelBuffer?
Solution
The imageBuffer returned by CMSampleBufferGetImageBuffer actually contains two discrete planes: a luminance (Yp) plane and a chrominance (CbCr) plane (note that for 4:2:0, the chrominance plane is half the size of the luminance plane in each dimension). This is discussed in this sample code project.
This gets you almost all the way there. I have no experience with Core Video's CVPixelBufferCreateWithBytes, but the following code will create scaled Yp and CbCr buffers for you and convert them to an interleaved ARGB buffer:
let lumaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)
let lumaWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
let lumaHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
let lumaRowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)

var sourceLumaBuffer = vImage_Buffer(data: lumaBaseAddress, height: vImagePixelCount(lumaHeight), width: vImagePixelCount(lumaWidth), rowBytes: lumaRowBytes)

let chromaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1)
let chromaWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 1)
let chromaHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1)
let chromaRowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)

var sourceChromaBuffer = vImage_Buffer(data: chromaBaseAddress, height: vImagePixelCount(chromaHeight), width: vImagePixelCount(chromaWidth), rowBytes: chromaRowBytes)

// Destinations: one 8-bit channel for luma, two interleaved 8-bit channels
// for chroma, each a quarter of its source plane's dimensions.
var destLumaBuffer = try! vImage_Buffer(size: CGSize(width: Int(sourceLumaBuffer.width / 4), height: Int(sourceLumaBuffer.height / 4)), bitsPerPixel: 8)
var destChromaBuffer = try! vImage_Buffer(size: CGSize(width: Int(sourceChromaBuffer.width / 4), height: Int(sourceChromaBuffer.height / 4)), bitsPerPixel: 8 * 2)

vImageScale_CbCr8(&sourceChromaBuffer, &destChromaBuffer, nil, 0)
vImageScale_Planar8(&sourceLumaBuffer, &destLumaBuffer, nil, 0)

var argbBuffer = try! vImage_Buffer(size: destLumaBuffer.size, bitsPerPixel: 8 * 4)

// Generate the Yp'CbCr-to-ARGB conversion once; the pixel range here
// assumes video-range 420Yp8_CbCr8 input and the ITU-R BT.709 matrix.
var infoYpCbCrToARGB = vImage_YpCbCrToARGB()
var pixelRange = vImage_YpCbCrPixelRange(Yp_bias: 16, CbCr_bias: 128, YpRangeMax: 235, CbCrRangeMax: 240, YpMax: 235, YpMin: 16, CbCrMax: 240, CbCrMin: 16)
vImageConvert_YpCbCrToARGB_GenerateConversion(kvImage_YpCbCrToARGBMatrix_ITU_R_709_2, &pixelRange, &infoYpCbCrToARGB, kvImage420Yp8_CbCr8, kvImageARGB8888, vImage_Flags(kvImageNoFlags))

vImageConvert_420Yp8_CbCr8ToARGB8888(&destLumaBuffer, &destChromaBuffer, &argbBuffer, &infoYpCbCrToARGB, nil, 255, vImage_Flags(kvImagePrintDiagnosticsToConsole))

destLumaBuffer.free()
destChromaBuffer.free()
argbBuffer.free()
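To get back to the asker's original goal of a CVPixelBuffer in the same biplanar format, one option is to skip the ARGB conversion entirely: create a destination CVPixelBuffer with CVPixelBufferCreate and scale each source plane directly into the corresponding destination plane. A hedged sketch (the `scaleBiPlanar` and `planeBuffer` names are mine, and this is untested):

```swift
import Accelerate
import CoreVideo

// Sketch: scale a 420f biplanar buffer into a new biplanar buffer of the
// given size by wrapping each CVPixelBuffer plane in a vImage_Buffer.
func scaleBiPlanar(_ source: CVPixelBuffer, to width: Int, _ height: Int) -> CVPixelBuffer? {
    var newBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                     kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                     nil, &newBuffer)
    guard status == kCVReturnSuccess, let dest = newBuffer else { return nil }

    CVPixelBufferLockBaseAddress(source, .readOnly)
    CVPixelBufferLockBaseAddress(dest, [])
    defer {
        CVPixelBufferUnlockBaseAddress(dest, [])
        CVPixelBufferUnlockBaseAddress(source, .readOnly)
    }

    // Wrap a single plane of a CVPixelBuffer in a vImage_Buffer, using the
    // plane's own row bytes so any padding is respected.
    func planeBuffer(_ pb: CVPixelBuffer, _ plane: Int) -> vImage_Buffer {
        vImage_Buffer(data: CVPixelBufferGetBaseAddressOfPlane(pb, plane),
                      height: vImagePixelCount(CVPixelBufferGetHeightOfPlane(pb, plane)),
                      width: vImagePixelCount(CVPixelBufferGetWidthOfPlane(pb, plane)),
                      rowBytes: CVPixelBufferGetBytesPerRowOfPlane(pb, plane))
    }

    var srcLuma = planeBuffer(source, 0)
    var dstLuma = planeBuffer(dest, 0)
    var srcChroma = planeBuffer(source, 1)
    var dstChroma = planeBuffer(dest, 1)

    // Scale the single-channel Yp plane and the two-channel CbCr plane separately.
    guard vImageScale_Planar8(&srcLuma, &dstLuma, nil, vImage_Flags(kvImageNoFlags)) == kvImageNoError,
          vImageScale_CbCr8(&srcChroma, &dstChroma, nil, vImage_Flags(kvImageNoFlags)) == kvImageNoError
    else { return nil }

    return dest
}
```

Because the destination planes live inside the new CVPixelBuffer, there is nothing to recombine afterwards, and no intermediate allocations to free; the scaled buffer can be handed straight to AVFoundation, e.g. `scaleBiPlanar(imageBuffer, to: 1280, 720)`.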