
Two videos, one MediaController

I want to display two video views side by side. I don't want to implement a custom MediaController, since the default one is perfectly fine, but no matter what I do I cannot control both videos at the same time.

    val mediaController = MediaController(requireContext())
    mediaController.setAnchorView(videoViewF)
    videoViewF.setMediaController(mediaController)
    videoViewR.setMediaController(mediaController)

How can I achieve this? Can I get a callback from the MediaController, or from the first VideoView, when the progress changes or playback is paused/resumed? Or is there some other way?

Solution

Dual VideoView playing 3gp videos in Android

This example explains how to put two VideoViews in a layout and play two different 3gp files at the same time.

Algorithm:

1.) Create a new project via File -> New -> Android Project and name it DualVideoViewExample.

2.) Write the following into main.xml:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:orientation="vertical" >

    <TextView
        android:layout_width="fill_parent"
        android:layout_height="wrap_content"
        android:text="Dual VideoView" />

    <LinearLayout
        android:orientation="vertical"
        android:layout_width="fill_parent"
        android:layout_height="match_parent">

        <VideoView
            android:id="@+id/myvideoview"
            android:layout_width="fill_parent"
            android:layout_height="wrap_content" />

        <VideoView
            android:id="@+id/myvideoview2"
            android:layout_width="fill_parent"
            android:layout_height="wrap_content" />

    </LinearLayout>
</LinearLayout>

3.) Put two 3gp video files into the res/raw folder.

4.) Run the app to see the output.

Steps:

1.) Create a project named DualVideoViewExample and fill in the details as shown in the screenshot below.

Build Target: Android 4.4
Application Name: DualVideoViewExample
Package Name: com.example.DualVideoViewExample
Activity Name: DualVideoViewExampleActivity

(screenshot: dualvideoview1)

2.) Open the DualVideoViewExampleActivity.java file and write the following code there:

package com.example.dualvideoviewexample;

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class DualVideoViewExampleActivity extends Activity {

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        VideoView myVideoView = (VideoView) findViewById(R.id.myvideoview);
        //myVideoView.setVideoURI(Uri.parse(SrcPath));
        myVideoView.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.junglebook));
        myVideoView.setMediaController(new MediaController(this));
        myVideoView.requestFocus();
        myVideoView.start();

        VideoView myVideoView2 = (VideoView) findViewById(R.id.myvideoview2);
        myVideoView2.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.ringaroses));
        myVideoView2.setMediaController(new MediaController(this));
        myVideoView2.requestFocus();
        myVideoView2.start();
    }
}

3.) Compile and build the project.

Note: Instead of loading the 3gp from the raw folder as this example does, you can also stream it directly from the internet and play it.

Output

(screenshot: dualvideoview2)

Another answer:

You didn't go into detail about what exactly you tried and what problems you ran into, so I simply did a small test to see whether I could reproduce any of what you're describing.

I don't have any conclusive findings, but I can at least confirm that my Galaxy Nexus (Android 4.0.2) is able to play three videos simultaneously without any problems. On the other hand, an old Samsung Galaxy Spica (Android 2.1-update1) I had lying around only plays one file at a time - it always appears to be the first SurfaceView.

I investigated different API levels further by setting up emulators for Android 3.0, 2.3.3 and 2.2. All of these platforms appear perfectly capable of playing multiple video files on different surface views. I did one final test with an emulator running 2.1-update1 as well, and interestingly, unlike the actual phone, it also played the test case without problems. I did notice some slight differences in how the layout was rendered, though.

This behaviour leads me to suspect that there isn't really any software limitation on what you're after, but rather that it depends on whether the hardware supports playing multiple video files simultaneously. Support for this scenario will therefore differ from device to device. From an empirical point of view, I definitely think it would be interesting to test this hypothesis on more physical devices.

Just for reference, some details on the implementation:

I set up two slightly different implementations: one based on three MediaPlayer instances in a single Activity, and one in which they were factored out into three separate fragments, each with its own MediaPlayer object. (I didn't find any playback differences between the two implementations, by the way.)

A single 3gp file located in the assets folder (thanks, Apple) was used for playback with all the players.

The code for both implementations is attached below and is largely based on Google's MediaPlayerDemo_Video sample implementation - I did strip out some code that wasn't needed for the actual test. The result is by no means complete or fit for use in live apps.

Activity-based implementation:

import android.app.Activity;
import android.content.res.AssetFileDescriptor;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.media.MediaPlayer.OnBufferingUpdateListener;
import android.media.MediaPlayer.OnCompletionListener;
import android.media.MediaPlayer.OnPreparedListener;
import android.media.MediaPlayer.OnVideoSizeChangedListener;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

public class MultipleVideoPlayActivity extends Activity implements
        OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener,
        OnVideoSizeChangedListener, SurfaceHolder.Callback {

    private static final String TAG = "MediaPlayer";
    private static final int[] SURFACE_RES_IDS = { R.id.video_1_surfaceview, R.id.video_2_surfaceview, R.id.video_3_surfaceview };

    private MediaPlayer[] mMediaPlayers = new MediaPlayer[SURFACE_RES_IDS.length];
    private SurfaceView[] mSurfaceViews = new SurfaceView[SURFACE_RES_IDS.length];
    private SurfaceHolder[] mSurfaceHolders = new SurfaceHolder[SURFACE_RES_IDS.length];
    private boolean[] mSizeKnown = new boolean[SURFACE_RES_IDS.length];
    private boolean[] mVideoReady = new boolean[SURFACE_RES_IDS.length];

    @Override public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        setContentView(R.layout.multi_videos_layout);

        // create surface holders
        for (int i = 0; i < mSurfaceViews.length; i++) {
            mSurfaceViews[i] = (SurfaceView) findViewById(SURFACE_RES_IDS[i]);
            mSurfaceHolders[i] = mSurfaceViews[i].getHolder();
            mSurfaceHolders[i].addCallback(this);
            mSurfaceHolders[i].setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }
    }

    public void onBufferingUpdate(MediaPlayer player, int percent) {
        Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onBufferingUpdate percent: " + percent);
    }

    public void onCompletion(MediaPlayer player) {
        Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onCompletion called");
    }

    public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
        Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): onVideoSizeChanged called");
        if (width == 0 || height == 0) {
            Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
            return;
        }

        int index = indexOf(player);
        if (index == -1) return; // sanity check; should never happen
        mSizeKnown[index] = true;
        if (mVideoReady[index] && mSizeKnown[index]) {
            startVideoPlayback(player);
        }
    }

    public void onPrepared(MediaPlayer player) {
        Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onPrepared called");

        int index = indexOf(player);
        if (index == -1) return; // sanity check; should never happen
        mVideoReady[index] = true;
        if (mVideoReady[index] && mSizeKnown[index]) {
            startVideoPlayback(player);
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceChanged called");
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceDestroyed called");
    }

    public void surfaceCreated(SurfaceHolder holder) {
        Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceCreated called");

        int index = indexOf(holder);
        if (index == -1) return; // sanity check; should never happen
        try {
            mMediaPlayers[index] = new MediaPlayer();
            AssetFileDescriptor afd = getAssets().openFd("sample.3gp");
            mMediaPlayers[index].setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
            mMediaPlayers[index].setDisplay(mSurfaceHolders[index]);
            mMediaPlayers[index].prepare();
            mMediaPlayers[index].setOnBufferingUpdateListener(this);
            mMediaPlayers[index].setOnCompletionListener(this);
            mMediaPlayers[index].setOnPreparedListener(this);
            mMediaPlayers[index].setOnVideoSizeChangedListener(this);
            mMediaPlayers[index].setAudioStreamType(AudioManager.STREAM_MUSIC);
        }
        catch (Exception e) { e.printStackTrace(); }
    }

    @Override protected void onPause() {
        super.onPause();
        releaseMediaPlayers();
    }

    @Override protected void onDestroy() {
        super.onDestroy();
        releaseMediaPlayers();
    }

    private void releaseMediaPlayers() {
        for (int i = 0; i < mMediaPlayers.length; i++) {
            if (mMediaPlayers[i] != null) {
                mMediaPlayers[i].release();
                mMediaPlayers[i] = null;
            }
        }
    }

    private void startVideoPlayback(MediaPlayer player) {
        Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): startVideoPlayback");
        player.start();
    }

    private int indexOf(MediaPlayer player) {
        for (int i = 0; i < mMediaPlayers.length; i++) if (mMediaPlayers[i] == player) return i;
        return -1;
    }

    private int indexOf(SurfaceHolder holder) {
        for (int i = 0; i < mSurfaceHolders.length; i++) if (mSurfaceHolders[i] == holder) return i;
        return -1;
    }
}

R.layout.multi_videos_layout:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent" android:layout_height="match_parent"
android:orientation="vertical">

<SurfaceView android:id="@+id/video_1_surfaceview"
    android:layout_width="fill_parent" android:layout_height="0dp"
    android:layout_weight="1" />

<SurfaceView android:id="@+id/video_2_surfaceview"
    android:layout_width="fill_parent" android:layout_height="0dp"
    android:layout_weight="1" />

<SurfaceView android:id="@+id/video_3_surfaceview"
    android:layout_width="fill_parent" android:layout_height="0dp"
    android:layout_weight="1" />

 </LinearLayout>

Fragment-based implementation:

import android.content.res.AssetFileDescriptor;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.media.MediaPlayer.OnBufferingUpdateListener;
import android.media.MediaPlayer.OnCompletionListener;
import android.media.MediaPlayer.OnPreparedListener;
import android.media.MediaPlayer.OnVideoSizeChangedListener;
import android.os.Bundle;
import android.support.v4.app.Fragment;
import android.support.v4.app.FragmentActivity;
import android.util.Log;
import android.view.LayoutInflater;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;

public class MultipleVideoPlayFragmentActivity extends FragmentActivity {

    private static final String TAG = "MediaPlayer";

    @Override public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        setContentView(R.layout.multi_videos_activity_layout);
    }

    public static class VideoFragment extends Fragment implements
            OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener,
            OnVideoSizeChangedListener, SurfaceHolder.Callback {

        private MediaPlayer mMediaPlayer;
        private SurfaceView mSurfaceView;
        private SurfaceHolder mSurfaceHolder;
        private boolean mSizeKnown;
        private boolean mVideoReady;

        @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
            return inflater.inflate(R.layout.multi_videos_fragment_layout, container, false);
        }

        @Override public void onActivityCreated(Bundle savedInstanceState) {
            super.onActivityCreated(savedInstanceState);
            mSurfaceView = (SurfaceView) getView().findViewById(R.id.video_surfaceview);
            mSurfaceHolder = mSurfaceView.getHolder();
            mSurfaceHolder.addCallback(this);
            mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }

        public void onBufferingUpdate(MediaPlayer player, int percent) {
            Log.d(TAG, "onBufferingUpdate percent: " + percent);
        }

        public void onCompletion(MediaPlayer player) {
            Log.d(TAG, "onCompletion called");
        }

        public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
            Log.v(TAG, "onVideoSizeChanged called");
            if (width == 0 || height == 0) {
                Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
                return;
            }

            mSizeKnown = true;
            if (mVideoReady && mSizeKnown) {
                startVideoPlayback();
            }
        }

        public void onPrepared(MediaPlayer player) {
            Log.d(TAG, "onPrepared called");

            mVideoReady = true;
            if (mVideoReady && mSizeKnown) {
                startVideoPlayback();
            }
        }

        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            Log.d(TAG, "surfaceChanged called");
        }

        public void surfaceDestroyed(SurfaceHolder holder) {
            Log.d(TAG, "surfaceDestroyed called");
        }

        public void surfaceCreated(SurfaceHolder holder) {
            Log.d(TAG, "surfaceCreated called");

            try {
                mMediaPlayer = new MediaPlayer();
                AssetFileDescriptor afd = getActivity().getAssets().openFd("sample.3gp");
                mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
                mMediaPlayer.setDisplay(mSurfaceHolder);
                mMediaPlayer.prepare();
                mMediaPlayer.setOnBufferingUpdateListener(this);
                mMediaPlayer.setOnCompletionListener(this);
                mMediaPlayer.setOnPreparedListener(this);
                mMediaPlayer.setOnVideoSizeChangedListener(this);
                mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
            }
            catch (Exception e) { e.printStackTrace(); }
        }

        @Override public void onPause() {
            super.onPause();
            releaseMediaPlayer();
        }

        @Override public void onDestroy() {
            super.onDestroy();
            releaseMediaPlayer();
        }

        private void releaseMediaPlayer() {
            if (mMediaPlayer != null) {
                mMediaPlayer.release();
                mMediaPlayer = null;
            }
        }

        private void startVideoPlayback() {
            Log.v(TAG, "startVideoPlayback");
            mMediaPlayer.start();
        }
    }
}

R.layout.multi_videos_activity_layout:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent" android:layout_height="match_parent"
android:orientation="vertical">

<fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
    android:id="@+id/video_1_fragment" android:layout_width="fill_parent"
    android:layout_height="0dp" android:layout_weight="1" />

<fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
    android:id="@+id/video_2_fragment" android:layout_width="fill_parent"
    android:layout_height="0dp" android:layout_weight="1" />

<fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
    android:id="@+id/video_3_fragment" android:layout_width="fill_parent"
    android:layout_height="0dp" android:layout_weight="1" />

</LinearLayout>

R.layout.multi_videos_fragment_layout:

<?xml version="1.0" encoding="utf-8"?>
<SurfaceView xmlns:android="http://schemas.android.com/apk/res/android"
android:id="@+id/video_surfaceview" android:layout_width="fill_parent"
android:layout_height="fill_parent" />

Update: Although it's been around for a while now, I think it's worth pointing out that Google's Grafika project showcases a "double decode" feature, which "decodes two video streams simultaneously to two TextureViews". Not sure how well it scales to more than two video files, but it's nevertheless relevant to the original question.
