iOS app crashes when frequently drawing into a CGContext with a CGImage created from [UInt8] data

How can I fix an iOS app crash that occurs when frequently drawing into a CGContext with a CGImage created from [UInt8] data?

I am currently developing a module in which I need to create a video from an array of CGImage. While doing this, my app crashes at some point, and I cannot find the exact reason behind the crash.

Can anyone tell me whether I am heading in the right direction? Should I convert [CGImage] to a video, or do I need to choose a different approach?

I also tried converting the CGImage to UIImage before creating the video, but I still run into the same problem.

I receive the image data as [UInt8], so what is the correct way to convert the image format and create the video?

To create the video from [CGImage] I follow the approach below: I use CGDataProvider to convert the [UInt8] data into a CGImage, and then convert the CGImage into a UIImage. I have a series of images, so I collect the UIImages, then merge them and create the video.

Here is the code I use to create the CGImage from the data.

private(set) var data: [UInt8]

var cgImage: CGImage? {
    let colorSpaceRef = CGColorSpaceCreateDeviceRGB()

    let bitsPerComponent = 8
    let bitsPerPixel = channels * bitsPerComponent
    let bytesPerRow = channels * width
    let totalBytes = height * bytesPerRow
    let bitmapInfo = CGBitmapInfo(rawValue: channels == 3 ? CGImageAlphaInfo.none.rawValue : CGImageAlphaInfo.last.rawValue)
    let provider = CGDataProvider(dataInfo: nil, data: data, size: totalBytes, releaseData: { _, _, _ in })!

    return CGImage(width: width, height: height, bitsPerComponent: bitsPerComponent, bitsPerPixel: bitsPerPixel, bytesPerRow: bytesPerRow, space: colorSpaceRef, bitmapInfo: bitmapInfo, provider: provider, decode: nil, shouldInterpolate: false, intent: .perceptual)
}
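One detail worth noting about that provider call (my own note, not from the original post): CGDataProvider(dataInfo:data:size:releaseData:) does not copy the bytes it is given, and passing a Swift [UInt8] array to the UnsafeRawPointer parameter only guarantees the pointer for the duration of that call. A minimal alternative sketch, using the same names as above but with plain parameters, hands Core Graphics a copied CFData so it owns the pixel bytes itself:

import CoreGraphics
import Foundation

// Sketch only, not the original code. `bytes`, `width`, `height`, `channels`
// mirror the properties used above but are ordinary parameters here.
// CGDataProvider(data:) takes a CFData, which retains the bytes, so the
// resulting CGImage does not depend on the lifetime of the Swift array.
func makeCGImage(bytes: [UInt8], width: Int, height: Int, channels: Int) -> CGImage? {
    let bitsPerComponent = 8
    let bytesPerRow = channels * width
    let alpha: CGImageAlphaInfo = (channels == 3) ? .none : .last
    guard let provider = CGDataProvider(data: Data(bytes) as CFData) else { return nil }
    return CGImage(width: width,
                   height: height,
                   bitsPerComponent: bitsPerComponent,
                   bitsPerPixel: channels * bitsPerComponent,
                   bytesPerRow: bytesPerRow,
                   space: CGColorSpaceCreateDeviceRGB(),
                   bitmapInfo: CGBitmapInfo(rawValue: alpha.rawValue),
                   provider: provider,
                   decode: nil,
                   shouldInterpolate: false,
                   intent: .perceptual)
}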

When I start drawing images into the context frequently, my app crashes in this call:

context!.draw(cgImage, in: CGRect(x: 0, y: 0, width: frameWidth, height: frameHeight))

If I take a number of images from the bundle and create the video with this code, it works fine. When I use CGImages created from [UInt8] data, it starts crashing after writing 3-4 images.

func newPixelBufferFrom(cgImage: CGImage) -> CVPixelBuffer? {
    autoreleasepool {
        let options: [String: Any] = [kCVPixelBufferCGImageCompatibilityKey as String: true,
                                      kCVPixelBufferCGBitmapContextCompatibilityKey as String: true]
        var pxbuffer: CVPixelBuffer?
        let frameWidth = self.videoSettings[AVVideoWidthKey] as! Int
        let frameHeight = self.videoSettings[AVVideoHeightKey] as! Int
        let status = CVPixelBufferCreate(kCFAllocatorDefault, frameWidth, frameHeight, kCVPixelFormatType_32ARGB, options as CFDictionary?, &pxbuffer)
        assert(status == kCVReturnSuccess && pxbuffer != nil, "newPixelBuffer failed")

        CVPixelBufferLockBaseAddress(pxbuffer!, CVPixelBufferLockFlags(rawValue: 0))
        let pxdata = CVPixelBufferGetBaseAddress(pxbuffer!)
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
        let context = CGContext(data: pxdata, width: frameWidth, height: frameHeight, bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(pxbuffer!), space: rgbColorSpace, bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
        assert(context != nil, "context is nil")
        context!.concatenate(CGAffineTransform.identity)
        context!.draw(cgImage, in: CGRect(x: 0, y: 0, width: frameWidth, height: frameHeight))
        CVPixelBufferUnlockBaseAddress(pxbuffer!, CVPixelBufferLockFlags(rawValue: 0))
        return pxbuffer
    }
}
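As a small variation (again my own sketch, not the original code): instead of allocating a fresh buffer with CVPixelBufferCreate for every frame, AVAssetWriterInputPixelBufferAdaptor exposes a pixelBufferPool once writing has started, and buffers can be drawn from that pool and reused. A rough sketch, assuming an adaptor whose writer has already called startWriting():

import AVFoundation
import CoreVideo

// Sketch only: draw a CGImage into a buffer taken from the adaptor's pool.
// `adaptor` is assumed to belong to a writer that has already called startWriting(),
// otherwise pixelBufferPool is nil.
func pooledPixelBuffer(from cgImage: CGImage,
                       adaptor: AVAssetWriterInputPixelBufferAdaptor,
                       frameWidth: Int,
                       frameHeight: Int) -> CVPixelBuffer? {
    guard let pool = adaptor.pixelBufferPool else { return nil }
    var pxbuffer: CVPixelBuffer?
    guard CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pxbuffer) == kCVReturnSuccess,
          let buffer = pxbuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // Wrap the buffer's memory in a bitmap context and draw the frame into it.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                  width: frameWidth,
                                  height: frameHeight,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue) else { return nil }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: frameWidth, height: frameHeight))
    return buffer
}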

Here is the code I use to create the video from the array of images.

typealias CXEMovieMakerCompletion = (URL) -> Void
typealias CXEMovieMakerUIImageExtractor = (AnyObject) -> UIImage?


public class CXEImagesToVideo: NSObject {
    var assetWriter: AVAssetWriter!
    var writeInput: AVAssetWriterInput!
    var bufferAdapter: AVAssetWriterInputPixelBufferAdaptor!
    var videoSettings: [String: Any]!
    var frameTime: CMTime!
    var fileURL: URL!
    
    var completionBlock: CXEMovieMakerCompletion?
    var movieMakerUIImageExtractor: CXEMovieMakerUIImageExtractor?
    
    
    public class func videoSettings(codec: String, width: Int, height: Int) -> [String: Any] {
        if width % 16 != 0 {
            print("warning: video settings width must be divisible by 16")
        }
        
        let videoSettings: [String: Any] = [AVVideoCodecKey: AVVideoCodecType.h264,
                                            AVVideoWidthKey: width,
                                            AVVideoHeightKey: height]
        
        return videoSettings
    }
    
    public init(videoSettings: [String: Any], frameTime: CMTime) {
        super.init()
        self.frameTime = frameTime
        let paths = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)
        let tempPath = paths[0] + "/exprotvideo1.mp4"
        if FileManager.default.fileExists(atPath: tempPath) {
            guard (try? FileManager.default.removeItem(atPath: tempPath)) != nil else {
                print("remove path failed")
                return
            }
        }
        
        self.fileURL = URL(fileURLWithPath: tempPath)
        self.assetWriter = try! AVAssetWriter(url: self.fileURL, fileType: AVFileType.mp4)
        
        self.videoSettings = videoSettings
        self.writeInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoSettings)
        assert(self.assetWriter.canAdd(self.writeInput), "add failed")
        
        self.assetWriter.add(self.writeInput)
        let bufferAttributes: [String: Any] = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32ARGB)]
        self.bufferAdapter = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: self.writeInput, sourcePixelBufferAttributes: bufferAttributes)
        self.frameTime = CMTimeMake(value: 1, timescale: 10)
    }
    
    func createMovieFrom(urls: [URL], withCompletion: @escaping CXEMovieMakerCompletion) {
        self.createMovieFromSource(images: urls as [AnyObject], extractor: { (inputObject: AnyObject) -> UIImage? in
            return UIImage(data: try! Data(contentsOf: inputObject as! URL))
        }, withCompletion: withCompletion)
    }
    
    func createMovieFrom(images: [UIImage], withCompletion: @escaping CXEMovieMakerCompletion) {
        DispatchQueue.main.async {
            self.createMovieFromSource(images: images, extractor: { (inputObject: AnyObject) -> UIImage? in
                return inputObject as? UIImage
            }, withCompletion: withCompletion)
        }
    }

    func imageFromLayer(layer: CALayer) -> UIImage {
        UIGraphicsBeginImageContextWithOptions(layer.frame.size, layer.isOpaque, 0)
        layer.render(in: UIGraphicsGetCurrentContext()!)
        let outputImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return outputImage!
    }
    
    
    
    func createMovieFromSource(images: [AnyObject], extractor: @escaping CXEMovieMakerUIImageExtractor, withCompletion: @escaping CXEMovieMakerCompletion) {

        self.completionBlock = withCompletion
        
        self.assetWriter.startWriting()
        self.assetWriter.startSession(atSourceTime: CMTime.zero)
        
        let mediaInputQueue = DispatchQueue(label: "Main") // DispatchQueue(label: "mediaInputQueue")
        var i = 0
        let frameNumber = images.count
        
        self.writeInput.requestMediaDataWhenReady(on: mediaInputQueue) {
            while true {
                if i >= frameNumber {
                    break
                }
                if self.writeInput.isReadyForMoreMediaData {
                    var sampleBuffer: CVPixelBuffer?
                    autoreleasepool {
                        let temp = images[i]
                        let img = extractor(temp)
                        if img == nil {
                            i += 1
                            print("Warning: could not extract one of the frames")
                            // continue
                        }

                        sampleBuffer = self.newPixelBufferFrom(cgImage: temp.cgImage!)
                    }
                    if sampleBuffer != nil {
                        if i == 0 {
                            self.bufferAdapter.append(sampleBuffer!, withPresentationTime: CMTime.zero)
                        } else {
                            let value = i - 1
                            let lastTime = CMTimeMake(value: Int64(value), timescale: self.frameTime.timescale)
                            let presentTime = CMTimeAdd(lastTime, self.frameTime)
                            self.bufferAdapter.append(sampleBuffer!, withPresentationTime: presentTime)
                        }
                        i = i + 1
                    }
                }
            }
            self.writeInput.markAsFinished()
            self.assetWriter.finishWriting {
                DispatchQueue.main.sync {
                    self.completionBlock!(self.fileURL)
                }
            }
        }
    }
    
    func newPixelBufferFrom(cgImage: CGImage) -> CVPixelBuffer? {
        autoreleasepool {
            let options: [String: Any] = [kCVPixelBufferCGImageCompatibilityKey as String: true,
                                          kCVPixelBufferCGBitmapContextCompatibilityKey as String: true]
            var pxbuffer: CVPixelBuffer?
            let frameWidth = self.videoSettings[AVVideoWidthKey] as! Int
            let frameHeight = self.videoSettings[AVVideoHeightKey] as! Int
            let status = CVPixelBufferCreate(kCFAllocatorDefault, frameWidth, frameHeight, kCVPixelFormatType_32ARGB, options as CFDictionary?, &pxbuffer)
            assert(status == kCVReturnSuccess && pxbuffer != nil, "newPixelBuffer failed")

            CVPixelBufferLockBaseAddress(pxbuffer!, CVPixelBufferLockFlags(rawValue: 0))
            let pxdata = CVPixelBufferGetBaseAddress(pxbuffer!)
            let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
            let context = CGContext(data: pxdata, width: frameWidth, height: frameHeight, bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(pxbuffer!), space: rgbColorSpace, bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
            assert(context != nil, "context is nil")
            // context?.clear(CGRect(x: 0, y: 0, width: frameWidth, height: frameHeight))
            context!.concatenate(CGAffineTransform.identity)
            context!.draw(cgImage, in: CGRect(x: 0, y: 0, width: frameWidth, height: frameHeight))
            CVPixelBufferUnlockBaseAddress(pxbuffer!, CVPixelBufferLockFlags(rawValue: 0))
            return pxbuffer
        }
    }
}
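For completeness, a rough usage sketch of how this class would be driven (my own illustration, not part of the original post); `uiImages` stands in for the [UIImage] array built from the [UInt8] frames:

// Hypothetical caller, for illustration only.
let settings = CXEImagesToVideo.videoSettings(codec: AVVideoCodecType.h264.rawValue,
                                              width: 640,
                                              height: 480)
let movieMaker = CXEImagesToVideo(videoSettings: settings,
                                  frameTime: CMTimeMake(value: 1, timescale: 10))
movieMaker.createMovieFrom(images: uiImages) { fileURL in
    print("video written to \(fileURL)")
}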

