Problem Description
Background:
I have been trying for two days to implement a video recorder like the one in Vine. First I tried MediaRecorder, but the videos I need may be composed of many small clips, and that class cannot be used to record a short video clip. Then I found MediaCodec, FFmpeg, and JavaCV. FFmpeg and JavaCV can solve the problem, but I would have to compile my project with many library files, which produces a very large APK. So I prefer to implement it with MediaCodec, even though that class is only available on Android 4.1 and later; about 90% of users would still be covered.
Result:
I finally got the encoded file, but it cannot be played. I checked its information with ffprobe, and the result looks like this:
I do not know much about the mechanism of H.264 coding.
Code (modified from this link):
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

import android.graphics.ImageFormat;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.util.Log;

public class AvcEncoder {
    private static final String TAG = AvcEncoder.class.getSimpleName();

    private MediaCodec mediaCodec;
    private BufferedOutputStream outputStream;
    private int mWidth, mHeight;
    private byte[] mDestData;

    public AvcEncoder(int w, int h) {
        mWidth = w;
        mHeight = h;
        Log.d(TAG, "Thread Id: " + Thread.currentThread().getId());
        File f = new File("/sdcard/videos/test.mp4");
        try {
            outputStream = new BufferedOutputStream(new FileOutputStream(f));
            Log.i(TAG, "outputStream initialized");
        } catch (Exception e) {
            e.printStackTrace();
        }
        try {
            mediaCodec = MediaCodec.createEncoderByType("video/avc");
        } catch (IOException e) {
            e.printStackTrace();
        }
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", w, h);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
        mDestData = new byte[w * h
                * ImageFormat.getBitsPerPixel(ImageFormat.YV12) / 8];
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        mediaCodec.configure(mediaFormat, null, null,
                MediaCodec.CONFIGURE_FLAG_ENCODE);
        mediaCodec.start();
    }

    public void close() {
        try {
            mediaCodec.stop();
            mediaCodec.release();
            mediaCodec = null;
            outputStream.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void offerEncoder(byte[] input) {
        try {
            CameraUtils.transYV12toYUV420Planar(input, mDestData, mWidth, mHeight);
            ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
            ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
            int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
            if (inputBufferIndex >= 0) {
                ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                inputBuffer.clear();
                inputBuffer.put(mDestData);
                mediaCodec.queueInputBuffer(inputBufferIndex, 0,
                        mDestData.length, 0, 0);
            }
            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
            int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            while (outputBufferIndex >= 0) {
                ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
                byte[] outData = new byte[bufferInfo.size];
                outputBuffer.get(outData);
                try {
                    outputStream.write(outData, 0, outData.length);
                } catch (Exception e) {
                    Log.d(TAG, "Outputstream write failed");
                    e.printStackTrace();
                }
                mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            }
        } catch (Throwable t) {
            t.printStackTrace();
        }
    }
}
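The CameraUtils.transYV12toYUV420Planar helper used above is not shown in the question. A minimal sketch of what such a conversion could look like (hypothetical, and ignoring the 16-byte row-stride alignment that real Camera YV12 buffers may have): YV12 stores a full-size Y plane followed by quarter-size V and U planes, while planar YUV 4:2:0 (I420) stores Y, then U, then V, so the conversion copies Y and swaps the two chroma planes.

```java
public class CameraUtils {
    // YV12:  [ Y (w*h) ][ V (w*h/4) ][ U (w*h/4) ]
    // I420:  [ Y (w*h) ][ U (w*h/4) ][ V (w*h/4) ]
    public static void transYV12toYUV420Planar(byte[] src, byte[] dst,
                                               int width, int height) {
        int ySize = width * height;
        int cSize = ySize / 4;
        // Y plane is identical in both layouts.
        System.arraycopy(src, 0, dst, 0, ySize);
        // U comes after V in YV12 but before V in I420: swap the planes.
        System.arraycopy(src, ySize + cSize, dst, ySize, cSize);  // U plane
        System.arraycopy(src, ySize, dst, ySize + cSize, cSize);  // V plane
    }
}
```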
Invoke this class from the Camera's startPreview:
private void startPreview() {
    if (mCamera == null) {
        return;
    }
    try {
        mCamera.setPreviewDisplay(mSurfaceView.getHolder());
        Parameters p = mCamera.getParameters();
        Size s = p.getPreviewSize();
        int len = s.width * s.height
                * ImageFormat.getBitsPerPixel(p.getPreviewFormat()) / 8;
        mAvcEncoder = new AvcEncoder(s.width, s.height);
        mCamera.addCallbackBuffer(new byte[len]);
        mCamera.setPreviewCallbackWithBuffer(new PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                mAvcEncoder.offerEncoder(data);
                mCamera.addCallbackBuffer(data);
            }
        });
        mCamera.startPreview();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Close it when releasing the Camera:
private void releaseCamera() {
    if (mCamera != null) {
        mCamera.stopPreview();
        mCamera.release();
        mCamera = null;
    }
    if (mAvcEncoder != null) {
        mAvcEncoder.close();
    }
}
You're saving a raw H.264 stream. You should convert it to .mp4 format. The easiest way to do this is with the MediaMuxer class (API 18+).
You can find a simple example on bigflake and more complete examples in Grafika.
You will need to provide presentation time stamps for each frame. You can either generate them according to your desired frame rate (like the bigflake example) or acquire them from the source (like the camera-input examples in Grafika).
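The frame-rate approach to timestamps can be sketched with a small helper (hypothetical; FrameTimestamps and ptsForFrame are not part of the original code). The presentation time in microseconds is derived from the frame index and the target frame rate; it would be passed to queueInputBuffer and carried to MediaMuxer.writeSampleData via BufferInfo.presentationTimeUs.

```java
public class FrameTimestamps {
    private static final long ONE_SECOND_US = 1_000_000L;

    // Presentation time (microseconds) of frame N at a fixed frame rate;
    // e.g. frame 15 at 15 fps starts exactly at the 1-second mark.
    public static long ptsForFrame(int frameIndex, int frameRate) {
        return frameIndex * ONE_SECOND_US / frameRate;
    }

    // Typical MediaMuxer flow (API 18+), sketched in comments:
    //   MediaMuxer muxer = new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    //   int track = muxer.addTrack(format);  // after INFO_OUTPUT_FORMAT_CHANGED
    //   muxer.start();
    //   muxer.writeSampleData(track, encodedBuffer, bufferInfo);  // per encoded frame
    //   muxer.stop(); muxer.release();
}
```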
Edit: For pre-API-18 devices (Android 4.1/4.2), MediaCodec is much more difficult to work with. You can't use Surface input or MediaMuxer, and the lack of platform tests led to some unfortunate incompatibilities. This answer has an overview.
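On the raw-stream path, the encoder's first output buffer (flagged BUFFER_FLAG_CODEC_CONFIG) typically carries the SPS and PPS as Annex-B NAL units. A naive sketch of locating them by scanning for four-byte start codes (an illustration only: real streams may also use three-byte 0x000001 start codes, which this scan does not handle):

```java
import java.util.ArrayList;
import java.util.List;

public class AnnexB {
    // Returns the offset of every 0x00 0x00 0x00 0x01 start code in the buffer.
    // Each NAL unit (e.g. SPS, then PPS in a codec-config buffer) begins at
    // one of these offsets.
    public static List<Integer> findStartCodes(byte[] data) {
        List<Integer> offsets = new ArrayList<>();
        for (int i = 0; i + 3 < data.length; i++) {
            if (data[i] == 0 && data[i + 1] == 0
                    && data[i + 2] == 0 && data[i + 3] == 1) {
                offsets.add(i);
            }
        }
        return offsets;
    }
}
```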
In your specific case, I will note that your sample code is attempting to specify the input format, but that has no effect -- the AVC codec defines what input formats it accepts, and your app must query for it. You will likely find that the colors in your encoded video are currently wrong, as the Camera and MediaCodec don't have any color formats in common (see that answer for color-swap code).
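As an illustration of the kind of color shuffling involved (a hypothetical sketch, not the color-swap code the answer links to), converting planar I420 to semi-planar NV12 (COLOR_FormatYUV420SemiPlanar, which some encoders accept instead of the planar format the question assumes) interleaves the chroma planes byte by byte, again ignoring stride alignment:

```java
public class ColorConvert {
    // I420: [ Y ][ U plane ][ V plane ]  ->  NV12: [ Y ][ U V U V ... ]
    public static byte[] i420ToNv12(byte[] i420, int width, int height) {
        int ySize = width * height;
        int cSize = ySize / 4;
        byte[] nv12 = new byte[ySize + 2 * cSize];
        System.arraycopy(i420, 0, nv12, 0, ySize);  // Y plane unchanged
        for (int i = 0; i < cSize; i++) {
            nv12[ySize + 2 * i] = i420[ySize + i];              // U sample
            nv12[ySize + 2 * i + 1] = i420[ySize + cSize + i];  // V sample
        }
        return nv12;
    }
}
```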