
iOS – Playing stacked videos

I have multiple imageView subviews stacked on top of each other based on my incoming data. Basically each of these subviews is set to either an image or a video layer, depending on the incoming data. The problem I'm running into is playing the videos. I can play the first video in the stack, but every video after that only plays the audio of the first video. How do I get each one to play properly?

The views are stepped through with a tap event (like Snapchat). See below:

@interface SceneImageViewController ()

@property (strong,nonatomic) NSURL *videoUrl;
@property (strong,nonatomic) AVPlayer *avPlayer;
@property (strong,nonatomic) AVPlayerLayer *avPlayerLayer;

@end

@implementation SceneImageViewController

- (void)viewDidLoad {

[super viewDidLoad];

self.mySubviews = [[NSMutableArray alloc] init];
self.videoCounterTags = [[NSMutableArray alloc] init];

int c = (int)[self.scenes count];
c--;
NSLog(@"int c = %d",c);
self.myCounter = [NSNumber numberWithInt:c];


for (int i=0; i<=c; i++) {

    //create imageView
    UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
    [imageView setUserInteractionEnabled:YES]; // <--- This is very important
    imageView.tag = i;                        // <--- Add tag to track this subview in the view stack
    [self.view addSubview:imageView];
    NSLog(@"added image view %d",i);


    //get scene object
    PFObject *sceneObject = self.scenes[i];


    //get the PFFile and filetype
    PFFile *file = [sceneObject objectForKey:@"file"];
    NSString *fileType = [sceneObject objectForKey:@"fileType"];



    //check the filetype
    if ([fileType  isEqual: @"image"])
    {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        //get image
        NSURL *imageFileUrl = [[NSURL alloc] initWithString:file.url];
        NSData *imageData = [NSData dataWithContentsOfURL:imageFileUrl];
            dispatch_async(dispatch_get_main_queue(),^{
        imageView.image = [UIImage imageWithData:imageData];
            });
        });

    }

    //its a video
    else
    {
        // the video player
        NSURL *fileUrl = [NSURL URLWithString:file.url];

        self.avPlayer = [AVPlayer playerWithURL:fileUrl];
        self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;

        self.avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
        //self.avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(playerItemDidReachEnd:)
                                                     name:AVPlayerItemDidPlayToEndTimeNotification
                                                   object:[self.avPlayer currentItem]];

        CGRect screenRect = [[UIScreen mainScreen] bounds];

        self.avPlayerLayer.frame = CGRectMake(0, 0, screenRect.size.width, screenRect.size.height);
        [imageView.layer addSublayer:self.avPlayerLayer];

        NSNumber *tag = [NSNumber numberWithInt:i+1];

        NSLog(@"tag = %@",tag);

        [self.videoCounterTags addObject:tag];

        //[self.avPlayer play];
    }



}



UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(viewTapped:)];

[self.view bringSubviewToFront:self.screen];

[self.screen addGestureRecognizer:tapGesture];


}


 - (void)viewTapped:(UIGestureRecognizer *)gesture{

NSLog(@"touch!");

[self.avPlayer pause];

int i = [self.myCounter intValue];
NSLog(@"counter = %d",i);



for(UIImageView *subview in [self.view subviews]) {

    if(subview.tag== i) {

        [subview removeFromSuperview];
    }
}

if ([self.videoCounterTags containsObject:self.myCounter]) {
    NSLog(@"play video!!!");
    [self.avPlayer play];
}

if (i == 0) {
    [self.avPlayer pause];
    [self.navigationController popViewControllerAnimated:NO];
}


i--;
self.myCounter = [NSNumber numberWithInt:i];


NSLog(@"counter after = %d",i);





}

Solution

What Brooks Hanes said is correct: you keep overriding the player. Each video in your for loop overwrites self.avPlayer, so [self.avPlayer play] in viewTapped: always talks to the same, last-created player.
Here's what I suggest you do:

> Add the tap gesture to the imageView instead of the screen (or, for a cleaner approach, use a UIButton instead):

UIImageView *imageView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height)];
[imageView setUserInteractionEnabled:YES]; // <--- This is very important
imageView.tag = i;                        // <--- Add tag to track this subview in the view stack
[self.view addSubview:imageView];
NSLog(@"added image view %d",i);
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(viewTapped:)];
[imageView addGestureRecognizer:tapGesture];

This way, in your viewTapped: method, you can get the tag of the image view that was tapped from gesture.view.tag instead of using myCounter; see the sketch that follows.
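For instance, a minimal sketch of the navigation side of viewTapped: under this approach (the pause, the removal of the tapped view, and the pop-on-last-view behaviour are carried over from the question's code; playing the next video is covered in the next point):

- (void)viewTapped:(UIGestureRecognizer *)gesture {

    NSInteger tag = gesture.view.tag;   // tag of the image view that was tapped
    NSLog(@"tapped view with tag %ld", (long)tag);

    [self.avPlayer pause];              // stop whatever is currently playing

    // remove the tapped view so the next subview in the stack becomes visible
    [gesture.view removeFromSuperview];

    // the bottom-most view was given tag 0; once it has been dismissed, leave the screen
    if (tag == 0) {
        [self.navigationController popViewControllerAnimated:NO];
    }
}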

> To get the videos working you could create a new AVPlayer for each video, but that could get quite expensive memory-wise. A better way is to use AVPlayerItem and swap the AVPlayer's AVPlayerItem when the video changes.

So in your for loop do something like this, where self.videoFiles is an NSMutableDictionary property:

// the video player
            NSNumber *tag = [NSNumber numberWithInt:i+1];
            NSURL *fileUrl = [NSURL URLWithString:file.url];
          //save your video file url paired with the ImageView it belongs to.
            [self.videoFiles setObject:fileUrl forKey:tag];
// you only need to initialize the player once.
            if(self.avPlayer == nil){
                AVAsset *asset = [AVAsset assetWithURL:fileUrl];
                AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
                self.avPlayer = [[AVPlayer alloc] initWithPlayerItem:item];
                self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;
                [[NSNotificationCenter defaultCenter] addObserver:self
                    selector:@selector(playerItemDidReachEnd:)
                name:AVPlayerItemDidPlayToEndTimeNotification
                object:[self.avPlayer currentItem]];
            }
            // you don't need to keep the layer as a property 
            // (unless you need it for some reason)
            AVPlayerLayer* avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
            avPlayerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

            CGRect screenRect = [[UIScreen mainScreen] bounds];                
            avPlayerLayer.frame = CGRectMake(0, 0, screenRect.size.width, screenRect.size.height);
            [imageView.layer addSublayer:avPlayerLayer];
            NSLog(@"tag = %@",tag);                
            [self.videoCounterTags addObject:tag];
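One thing that is referenced here but never shown is the playerItemDidReachEnd: method the notification observer points at. Because actionAtItemEnd is AVPlayerActionAtItemEndNone, the player simply holds the last frame when an item finishes; a minimal sketch, assuming you want the video to loop:

- (void)playerItemDidReachEnd:(NSNotification *)notification {
    // the observer was registered with the AVPlayerItem as the notification object,
    // so rewinding that item replays the video (the player never pauses because
    // actionAtItemEnd is AVPlayerActionAtItemEndNone)
    AVPlayerItem *item = [notification object];
    [item seekToTime:kCMTimeZero];
}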

Now in your viewTapped: method:

if ([self.videoCounterTags containsObject:@(gesture.view.tag)]) {

    NSLog(@"play video!!!");
    AVAsset *asset = [AVAsset assetWithURL:[self.videoFiles objectForKey:@(gesture.view.tag)]];
    AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
    [self.avPlayer replaceCurrentItemWithPlayerItem:item];
    [self.avPlayer play];
}

Or, since you have self.videoFiles, you don't need self.videoCounterTags at all:

NSURL *fileURL = [self.videoFiles objectForKey:@(gesture.view.tag)];
if (fileURL != nil) {
    NSLog(@"play video!!!");
    AVAsset *asset = [AVAsset assetWithURL:fileURL];
    AVPlayerItem *item = [[AVPlayerItem alloc] initWithAsset:asset];
    [self.avPlayer replaceCurrentItemWithPlayerItem:item];
    [self.avPlayer play];
}
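Both variants assume self.videoFiles has already been declared and initialized; a minimal sketch of that setup:

// in the class extension, next to the other properties
@property (strong, nonatomic) NSMutableDictionary *videoFiles;

// in viewDidLoad, before the for loop
self.videoFiles = [[NSMutableDictionary alloc] init];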

That's the gist of it.
