How to solve AVFoundation + AssetWriter: generating a movie with images and audio
I need to export a movie from my iPhone app that combines UIImages held in an NSArray with some .caf audio files that must start at pre-specified times. So far I have been able to export the video part built from the images using AVAssetWriter (after working through many questions and answers on this site and others), but I cannot find a way to add the audio files to complete the movie.

Here is what I have so far:

-(void) writeImagesToMovieAtPath:(NSString *) path withSize:(CGSize) size
{
NSLog(@"Write Started");
NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
[NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
error:&error];
NSParameterAssert(videoWriter);
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithInt:size.width], AVVideoWidthKey,
[NSNumber numberWithInt:size.height], AVVideoHeightKey,
nil];
AVAssetWriterInput* videoWriterInput = [[AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:videoSettings] retain];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
sourcePixelBufferAttributes:nil];
NSParameterAssert(videoWriterInput);
NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
videoWriterInput.expectsMediaDataInRealTime = YES;
[videoWriter addInput:videoWriterInput];
//Start a session:
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
//convert uiimage to CGImage.
int frameCount = 0;
for(UIImage * img in imageArray)
{
buffer = [self pixelBufferFromCGImage:[img CGImage] andSize:size];
BOOL append_ok = NO;
int j = 0;
while (!append_ok && j < 30)
{
if (adaptor.assetWriterInput.readyForMoreMediaData)
{
printf("appending %d attempt %d\n", frameCount, j);
CMTime frameTime = CMTimeMake(frameCount,(int32_t) kRecordingFPS);
append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
[NSThread sleepForTimeInterval:0.05];
}
else
{
printf("adaptor not ready %d, %d\n", frameCount, j);
[NSThread sleepForTimeInterval:0.1];
}
j++;
}
//Release the pixel buffer only after the append (and any retries) are done:
if (buffer)
CVBufferRelease(buffer);
if (!append_ok) {
printf("error appending image %d times %d\n", frameCount, j);
}
frameCount++;
}
//Finish the session:
[videoWriterInput markAsFinished];
[videoWriter finishWriting];
NSLog(@"Write Ended");
}
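(For reference, the listing above uses kRecordingFPS and imageArray without showing where they come from. The following is a guess at the supporting declarations, purely as assumptions to make the snippet self-contained; the class name MovieWriter is made up.)

//Hypothetical declarations assumed by the code above (not part of the original post):
#define kRecordingFPS 30 //frames per second used when building frameTime

@interface MovieWriter : NSObject
{
NSArray *imageArray; //UIImage objects, one per video frame
NSArray *audioInfoArray; //dictionaries describing the .caf clips (used further below)
}
-(void) writeImagesToMovieAtPath:(NSString *) path withSize:(CGSize) size;
-(void) addAudioToFileAtPath:(NSString *) filePath toPath:(NSString *)outFilePath;
@end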
And here is the code for pixelBufferFromCGImage:
- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image andSize:(CGSize) size
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
[NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
nil];
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, &pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
CVPixelBufferLockBaseAddress(pxbuffer,0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);
CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, 4*size.width, rgbColorSpace, kCGImageAlphaNoneSkipFirst);
NSParameterAssert(context);
CGContextConcatCTM(context,CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);
CVPixelBufferUnlockBaseAddress(pxbuffer,0);
return pxbuffer;
}
So, could you help me with how to add the audio files: how do I create buffers for them, and what should the adaptor and input settings look like?
If this approach is likely to cause problems, please advise me on how to use AVMutableComposition to export the image array as a video instead.
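For reference, here is a rough sketch (not from the original post) of what feeding the audio into the same AVAssetWriter could look like: an AVAssetWriterInput with AVMediaTypeAudio is added next to the video input, and the sample buffers are pulled out of the .caf file with an AVAssetReader. The variable audioPath and the AAC settings are assumptions, and error handling is omitted:

//Sketch only: the audio input must be added to videoWriter before [videoWriter startWriting],
//and this copy loop would run after all video frames have been appended and
//videoWriterInput has been marked as finished, but before [videoWriter finishWriting].
//The kAudioFormat... constants come from CoreAudioTypes.h (CoreAudio framework).
AudioChannelLayout channelLayout;
memset(&channelLayout, 0, sizeof(AudioChannelLayout));
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;

NSDictionary *audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
    [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
    [NSNumber numberWithFloat:44100.0f], AVSampleRateKey,
    [NSNumber numberWithInt:128000], AVEncoderBitRateKey,
    [NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
    nil];

AVAssetWriterInput *audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
    outputSettings:audioSettings];
audioWriterInput.expectsMediaDataInRealTime = NO;
NSParameterAssert([videoWriter canAddInput:audioWriterInput]);
[videoWriter addInput:audioWriterInput];

//Decode the .caf to PCM with an AVAssetReader and hand the samples to the writer input:
AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:audioPath] options:nil]; //audioPath is hypothetical
AVAssetReader *audioReader = [[AVAssetReader alloc] initWithAsset:audioAsset error:&error];
AVAssetTrack *audioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVAssetReaderTrackOutput *readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack
    outputSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kAudioFormatLinearPCM]
                                               forKey:AVFormatIDKey]];
[audioReader addOutput:readerOutput];
[audioReader startReading];

CMSampleBufferRef sample = NULL;
while ((sample = [readerOutput copyNextSampleBuffer]))
{
    while (!audioWriterInput.readyForMoreMediaData)
        [NSThread sleepForTimeInterval:0.05];
    [audioWriterInput appendSampleBuffer:sample]; //samples keep their original timestamps, i.e. the clip starts at time zero
    CFRelease(sample);
}
[audioWriterInput markAsFinished];
[audioReader release];

Making a clip start at an arbitrary, non-zero time with this route means retiming every sample buffer (for example with CMSampleBufferCreateCopyWithNewTiming), which gets awkward quickly; that is essentially why the solution below writes the silent movie first and then lays the audio clips into an AVMutableComposition.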
Solution
I ended up exporting the video separately using the code above, and then added the audio files afterwards using AVMutableComposition and AVAssetExportSession.
Here is the code:
-(void) addAudioToFileAtPath:(NSString *) filePath toPath:(NSString *)outFilePath
{
NSError * error = nil;
AVMutableComposition * composition = [AVMutableComposition composition];
AVURLAsset * videoAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:filePath] options:nil];
AVAssetTrack * videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID: kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,videoAsset.duration) ofTrack:videoAssetTrack atTime:kCMTimeZero
error:&error];
CMTime audioStartTime = kCMTimeZero;
for (NSDictionary * audioInfo in audioInfoArray)
{
NSString * pathString = [audioInfo objectForKey:audioFilePath];
AVURLAsset * urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:pathString] options:nil];
AVAssetTrack * audioAssetTrack = [[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID: kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,urlAsset.duration) ofTrack:audioAssetTrack atTime:audioStartTime error:&error];
audioStartTime = CMTimeAdd(audioStartTime,CMTimeMake((int) (([[audioInfo objectForKey:audioDuration] floatValue] * kRecordingFPS) + 0.5),kRecordingFPS));
}
AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
//assetExport.videoComposition = mutableVideoComposition; //only needed if an AVMutableVideoComposition has been built elsewhere; it is not defined in this snippet
assetExport.outputFileType = AVFileTypeQuickTimeMovie; //@"com.apple.quicktime-movie"
assetExport.outputURL = [NSURL fileURLWithPath:outFilePath];
[assetExport exportAsynchronouslyWithCompletionHandler:
^(void) {
switch (assetExport.status)
{
case AVAssetExportSessionStatusCompleted:
// export complete
NSLog(@"Export Complete");
break;
case AVAssetExportSessionStatusFailed:
NSLog(@"Export Failed");
NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
// export error (see exportSession.error)
break;
case AVAssetExportSessionStatusCancelled:
NSLog(@"Export Cancelled");
NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
// export cancelled
break;
}
}];
}
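Putting the two halves together, a hypothetical call sequence could look like the following; the file names, the temporary directory and the 480x320 size are made-up values:

NSString *tempPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"video_noaudio.mov"];
NSString *finalPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"video_with_audio.mov"];

[self writeImagesToMovieAtPath:tempPath withSize:CGSizeMake(480, 320)]; //blocks until the silent movie is written
[self addAudioToFileAtPath:tempPath toPath:finalPath]; //the export itself completes asynchronously

Note that AVAssetExportSession fails if a file already exists at its outputURL, so finalPath should be deleted first when the export is re-run.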
Could you replace the "for" loop with a single "audioInfo" dictionary that has all the values that need to be set, so that it becomes more copy-paste friendly? :)
If you only want to add a single audio file, the following code should replace the for loop:
NSString * pathString = [self getAudioFilePath];
AVURLAsset * urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:pathString] options:nil];
AVAssetTrack * audioAssetTrack = [[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID: kCMPersistentTrackID_Invalid];
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,urlAsset.duration) ofTrack:audioAssetTrack atTime:kCMTimeZero error:&error];
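For the full loop, audioInfoArray holds one dictionary per clip, keyed by the audioFilePath and audioDuration constants used above. Here is a sketch of how such an array might be built; the key strings, file names and durations are made-up values:

//Hypothetical keys and sample data matching what the loop expects:
static NSString * const audioFilePath = @"audioFilePath"; //NSString: path to a .caf file
static NSString * const audioDuration = @"audioDuration"; //NSNumber: clip length in seconds

NSArray *audioInfoArray = [NSArray arrayWithObjects:
    [NSDictionary dictionaryWithObjectsAndKeys:
        [[NSBundle mainBundle] pathForResource:@"clip1" ofType:@"caf"], audioFilePath,
        [NSNumber numberWithFloat:2.5f], audioDuration,
        nil],
    [NSDictionary dictionaryWithObjectsAndKeys:
        [[NSBundle mainBundle] pathForResource:@"clip2" ofType:@"caf"], audioFilePath,
        [NSNumber numberWithFloat:4.0f], audioDuration,
        nil],
    nil];

As written, the loop places each clip immediately after the stated duration of the previous one; clips that must start at arbitrary times would need audioStartTime derived from those times instead.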