In Android, I created a layout with three SurfaceViews side by side, and I want to play a video file on all three simultaneously, using a different MediaPlayer instance for each. The problem I am facing is that the three cannot all play the video at the same time; one or two of them get blocked.
The problem persists even if I use VideoView directly instead of the MediaPlayer class.
Can anyone please help? What is the issue? It gives a native error: surface creation failed. I have tried different combinations, such as one file in three different views, and three files in three different views, but the problem is still not solved.
Some replies on other sites say that it depends on the kernel version.
If it depends on the kernel version, could you please point me to any official Android documentation that says so? Or, if it is possible to play them, please give me the steps in code. Here is the error log -
04-10 19:23:37.995: E/ANDROID_DRM_TEST(2573): Client::notify In
04-10 19:23:37.995: V/AudioPolicyManager(2573): startOutput() output 1, stream 3, session 131
04-10 19:23:37.995: V/AudioPolicyManager(2573): getDeviceForStrategy() from cache strategy 0, device 2
04-10 19:23:37.995: V/AudioPolicyManager(2573): getNewDevice() selected device 2
04-10 19:23:37.995: V/AudioPolicyManager(2573): setOutputDevice() output 1 device 2 delayMs 0
04-10 19:23:37.995: V/AudioPolicyManager(2573): setOutputDevice() setting same device 2 or null device for output 1
04-10 19:23:37.995: I/AudioFlinger(2573): start output streamType (0, 3) for 1
04-10 19:23:37.995: D/AudioHardwareYamaha(2573): AudioStreamOut::setParameters(keyValuePairs="start_output_streamtype=3")
04-10 19:23:38.010: W/SEC_Overlay(2689): overlay_setPosition(0) 0,0,200,397 => 0,0,200,397
04-10 19:23:38.010: I/SEC_Overlay(2689): overlay_setParameter param[4]=4
04-10 19:23:38.010: D/SEC_Overlay(2689): dst width, height have changed [w= 200, h= 397] -> [w=200, h= 397]
04-10 19:23:38.010: I/SEC_Overlay(2689): Nothing to do!
04-10 19:23:38.090: E/VideoMIO(2573): AndroidSurfaceOutput::setParametersSync() VIDEO ROTATION 0
04-10 19:23:38.090: E/VideoMIO(2573): AndroidSurfaceOutput::setParametersSync() VIDEO RENDERER 1
04-10 19:23:38.090: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.090: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.090: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.195: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.195: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.195: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.230: E/VideoMIO(2573): AndroidSurfaceOutput::setParametersSync() VIDEO ROTATION 0
04-10 19:23:38.230: E/VideoMIO(2573): AndroidSurfaceOutput::setParametersSync() VIDEO RENDERER 1
04-10 19:23:38.230: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.230: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.230: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.295: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.295: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.295: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.330: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.330: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.330: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.395: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.395: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.395: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.435: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.435: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.435: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.495: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.495: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.495: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.535: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
Accepted answer
You're not giving many details about what exactly you have tried and where the problems lie, so I just put together a small test to see if I could reproduce any of the behaviour you're describing.
I don't have any conclusive findings, but I can at least confirm that my Galaxy Nexus (Android 4.0.2) is able to play three videos simultaneously without any problems. On the other hand, an old Samsung Galaxy Spica (Android 2.1-update1) I had lying around only plays a single file at a time - it always appears to be the first one.
I further investigated different API levels by setting up emulators for Android 3.0, 2.3.3 and 2.2. All of these platforms appear to handle playback of multiple video files onto different surface views just fine. I did one final test with an emulator running 2.1-update1, and interestingly it also ran the test case without problems, unlike the actual phone. I did notice some slight differences in how the layout was rendered, though.
This behaviour leads me to suspect that there isn't really a software limitation on what you're after; rather, it appears to depend on the hardware whether simultaneous playback of multiple video files is supported. Support for this scenario will therefore differ from device to device. From an empirical point of view, I definitely think it would be interesting to test this hypothesis on more physical devices.
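Since support varies per device, one practical option is to detect the failure at runtime and degrade gracefully (for example, drop back to a single player). This is only a sketch of that idea using MediaPlayer's standard error callback; the fallback policy of releasing the failed player is an illustrative assumption, not part of the original test code:

```java
// Sketch: detect a per-player failure (e.g. no free hardware overlay)
// so the app can fall back instead of silently showing a blank view.
player.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        Log.w(TAG, "Playback failed (what=" + what + ", extra=" + extra + ")");
        mp.release(); // assumption: free the resources so remaining players keep going
        return true;  // true = error handled; onCompletion will not be called
    }
});
```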
For your information, some details about the implementation:
I set up two slightly different implementations: one based on three MediaPlayer
instances in a single Activity
, and one in which the players were factored out into three separate fragments, each with its own MediaPlayer
object. (By the way, I did not find any playback differences between the two implementations.)
A single 3gp file located in the assets
folder (thank you, Apple) was used for playback with all players.
The code for both implementations is attached below and is largely based on Google's
sample MediaPlayer implementation - I did strip out some code that was not required for the actual testing. The result is by no means complete or suitable for use in live apps.
Activity-based implementation:
public class MultipleVideoPlayActivity extends Activity implements
        OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener,
        OnVideoSizeChangedListener, SurfaceHolder.Callback {

    private static final String TAG = "MediaPlayer";
    private static final int[] SURFACE_RES_IDS = { R.id.video_1_surfaceview,
            R.id.video_2_surfaceview, R.id.video_3_surfaceview };

    private MediaPlayer[] mMediaPlayers = new MediaPlayer[SURFACE_RES_IDS.length];
    private SurfaceView[] mSurfaceViews = new SurfaceView[SURFACE_RES_IDS.length];
    private SurfaceHolder[] mSurfaceHolders = new SurfaceHolder[SURFACE_RES_IDS.length];
    private boolean[] mSizeKnown = new boolean[SURFACE_RES_IDS.length];
    private boolean[] mVideoReady = new boolean[SURFACE_RES_IDS.length];

    @Override public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        setContentView(R.layout.multi_videos_layout);

        // create surface holders
        for (int i = 0; i < mSurfaceViews.length; i++) {
            mSurfaceViews[i] = (SurfaceView) findViewById(SURFACE_RES_IDS[i]);
            mSurfaceHolders[i] = mSurfaceViews[i].getHolder();
            mSurfaceHolders[i].addCallback(this);
            mSurfaceHolders[i].setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }
    }

    public void onBufferingUpdate(MediaPlayer player, int percent) {
        Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onBufferingUpdate percent: " + percent);
    }

    public void onCompletion(MediaPlayer player) {
        Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onCompletion called");
    }

    public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
        Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): onVideoSizeChanged called");
        if (width == 0 || height == 0) {
            Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
            return;
        }

        int index = indexOf(player);
        if (index == -1) return; // sanity check; should never happen

        mSizeKnown[index] = true;
        if (mVideoReady[index] && mSizeKnown[index]) {
            startVideoPlayback(player);
        }
    }

    public void onPrepared(MediaPlayer player) {
        Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onPrepared called");

        int index = indexOf(player);
        if (index == -1) return; // sanity check; should never happen

        mVideoReady[index] = true;
        if (mVideoReady[index] && mSizeKnown[index]) {
            startVideoPlayback(player);
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int i, int j, int k) {
        Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceChanged called");
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceDestroyed called");
    }

    public void surfaceCreated(SurfaceHolder holder) {
        Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceCreated called");

        int index = indexOf(holder);
        if (index == -1) return; // sanity check; should never happen

        try {
            mMediaPlayers[index] = new MediaPlayer();
            AssetFileDescriptor afd = getAssets().openFd("sample.3gp");
            mMediaPlayers[index].setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
            mMediaPlayers[index].setDisplay(mSurfaceHolders[index]);
            mMediaPlayers[index].prepare();
            mMediaPlayers[index].setOnBufferingUpdateListener(this);
            mMediaPlayers[index].setOnCompletionListener(this);
            mMediaPlayers[index].setOnPreparedListener(this);
            mMediaPlayers[index].setOnVideoSizeChangedListener(this);
            mMediaPlayers[index].setAudioStreamType(AudioManager.STREAM_MUSIC);
        }
        catch (Exception e) { e.printStackTrace(); }
    }

    @Override protected void onPause() {
        super.onPause();
        releaseMediaPlayers();
    }

    @Override protected void onDestroy() {
        super.onDestroy();
        releaseMediaPlayers();
    }

    private void releaseMediaPlayers() {
        for (int i = 0; i < mMediaPlayers.length; i++) {
            if (mMediaPlayers[i] != null) {
                mMediaPlayers[i].release();
                mMediaPlayers[i] = null;
            }
        }
    }

    private void startVideoPlayback(MediaPlayer player) {
        Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): startVideoPlayback");
        player.start();
    }

    private int indexOf(MediaPlayer player) {
        for (int i = 0; i < mMediaPlayers.length; i++) if (mMediaPlayers[i] == player) return i;
        return -1;
    }

    private int indexOf(SurfaceHolder holder) {
        for (int i = 0; i < mSurfaceHolders.length; i++) if (mSurfaceHolders[i] == holder) return i;
        return -1;
    }
}
R.layout.multi_videos_layout:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent" android:layout_height="match_parent"
android:orientation="vertical">
<SurfaceView android:id="@+id/video_1_surfaceview"
android:layout_width="fill_parent" android:layout_height="0dp"
android:layout_weight="1" />
<SurfaceView android:id="@+id/video_2_surfaceview"
android:layout_width="fill_parent" android:layout_height="0dp"
android:layout_weight="1" />
<SurfaceView android:id="@+id/video_3_surfaceview"
android:layout_width="fill_parent" android:layout_height="0dp"
android:layout_weight="1" />
</LinearLayout>
Fragment-based implementation:
public class MultipleVideoPlayFragmentActivity extends FragmentActivity {

    private static final String TAG = "MediaPlayer";

    @Override public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        setContentView(R.layout.multi_videos_activity_layout);
    }

    public static class VideoFragment extends Fragment implements
            OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener,
            OnVideoSizeChangedListener, SurfaceHolder.Callback {

        private MediaPlayer mMediaPlayer;
        private SurfaceView mSurfaceView;
        private SurfaceHolder mSurfaceHolder;
        private boolean mSizeKnown;
        private boolean mVideoReady;

        @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
            return inflater.inflate(R.layout.multi_videos_fragment_layout, container, false);
        }

        @Override public void onActivityCreated(Bundle savedInstanceState) {
            super.onActivityCreated(savedInstanceState);

            mSurfaceView = (SurfaceView) getView().findViewById(R.id.video_surfaceview);
            mSurfaceHolder = mSurfaceView.getHolder();
            mSurfaceHolder.addCallback(this);
            mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }

        public void onBufferingUpdate(MediaPlayer player, int percent) {
            Log.d(TAG, "onBufferingUpdate percent: " + percent);
        }

        public void onCompletion(MediaPlayer player) {
            Log.d(TAG, "onCompletion called");
        }

        public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
            Log.v(TAG, "onVideoSizeChanged called");
            if (width == 0 || height == 0) {
                Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
                return;
            }

            mSizeKnown = true;
            if (mVideoReady && mSizeKnown) {
                startVideoPlayback();
            }
        }

        public void onPrepared(MediaPlayer player) {
            Log.d(TAG, "onPrepared called");

            mVideoReady = true;
            if (mVideoReady && mSizeKnown) {
                startVideoPlayback();
            }
        }

        public void surfaceChanged(SurfaceHolder holder, int i, int j, int k) {
            Log.d(TAG, "surfaceChanged called");
        }

        public void surfaceDestroyed(SurfaceHolder holder) {
            Log.d(TAG, "surfaceDestroyed called");
        }

        public void surfaceCreated(SurfaceHolder holder) {
            Log.d(TAG, "surfaceCreated called");

            try {
                mMediaPlayer = new MediaPlayer();
                AssetFileDescriptor afd = getActivity().getAssets().openFd("sample.3gp");
                mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
                mMediaPlayer.setDisplay(mSurfaceHolder);
                mMediaPlayer.prepare();
                mMediaPlayer.setOnBufferingUpdateListener(this);
                mMediaPlayer.setOnCompletionListener(this);
                mMediaPlayer.setOnPreparedListener(this);
                mMediaPlayer.setOnVideoSizeChangedListener(this);
                mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
            }
            catch (Exception e) { e.printStackTrace(); }
        }

        @Override public void onPause() {
            super.onPause();
            releaseMediaPlayer();
        }

        @Override public void onDestroy() {
            super.onDestroy();
            releaseMediaPlayer();
        }

        private void releaseMediaPlayer() {
            if (mMediaPlayer != null) {
                mMediaPlayer.release();
                mMediaPlayer = null;
            }
        }

        private void startVideoPlayback() {
            Log.v(TAG, "startVideoPlayback");
            mMediaPlayer.start();
        }
    }
}
R.layout.multi_videos_activity_layout:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent" android:layout_height="match_parent"
android:orientation="vertical">
<fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
android:id="@+id/video_1_fragment" android:layout_width="fill_parent"
android:layout_height="0dp" android:layout_weight="1" />
<fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
android:id="@+id/video_2_fragment" android:layout_width="fill_parent"
android:layout_height="0dp" android:layout_weight="1" />
<fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
android:id="@+id/video_3_fragment" android:layout_width="fill_parent"
android:layout_height="0dp" android:layout_weight="1" />
</LinearLayout>
R.layout.multi_videos_fragment_layout:
<?xml version="1.0" encoding="utf-8"?>
<SurfaceView xmlns:android="http://schemas.android.com/apk/res/android"
android:id="@+id/video_surfaceview" android:layout_width="fill_parent"
android:layout_height="fill_parent" />
Update: although it has been around for a while, I just thought it was worth pointing out that Google's Grafika project demonstrates a 'double decode' feature, which "decodes two video streams simultaneously to two TextureViews". Not sure how well it scales to more than two video files, but it is nevertheless relevant to the original question.
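For completeness, the rough shape of playing a video into a TextureView (API 14+) instead of a SurfaceView is sketched below. TextureView content is composited by the GPU rather than through a hardware overlay plane, which is presumably why the Grafika approach sidesteps the 'overlays already in use' error seen in the log. The layout id here is an assumption for illustration:

```java
// Sketch: MediaPlayer rendering into a TextureView (API 14+).
// R.id.video_textureview is a hypothetical id for this example.
TextureView textureView = (TextureView) findViewById(R.id.video_textureview);
textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
        try {
            MediaPlayer player = new MediaPlayer();
            AssetFileDescriptor afd = getAssets().openFd("sample.3gp");
            player.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
            player.setSurface(new Surface(texture)); // render into the TextureView
            player.prepare();
            player.start();
        } catch (Exception e) { e.printStackTrace(); }
    }

    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) { }
    @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) { return true; }
    @Override public void onSurfaceTextureUpdated(SurfaceTexture texture) { }
});
```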