This article describes how to handle a color-space mismatch between MediaCodec and the Camera on Android. It may be a useful reference for anyone running into the same problem.

Problem description

I have been trying to get H264 encoding to work with input captured by the camera on an Android tablet, using the new low-level MediaCodec.
I have gone through some difficulties with this, since the MediaCodec API is poorly documented, but I've gotten something to work at last.

I'm setting up the camera as follows:

Camera.Parameters parameters = mCamera.getParameters();
parameters.setPreviewFormat(ImageFormat.YV12); // <1>
parameters.setPreviewFpsRange(4000, 60000);
parameters.setPreviewSize(640, 480);
mCamera.setParameters(parameters);

For the encoding part, I'm instantiating the MediaCodec object as follows:

mediaCodec = MediaCodec.createEncoderByType("video/avc");
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 640, 480);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar); // <2>
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mediaCodec.start();

The final goal is to create an RTP stream (and correspond with Skype), but so far I am only streaming the raw H264 directly to my desktop. There I use the following GStreamer pipeline to show the result:

gst-launch udpsrc port=5555 ! video/x-h264,width=640,height=480,framerate=15/1 ! ffdec_h264 ! autovideosink

All works well, except for the colors. I need to set two color formats in my code: one for the camera preview (the line tagged with <1>) and one for the MediaCodec object (tagged with <2>).

To determine the acceptable values for line <1>, I used parameters.getSupportedPreviewFormats(). From this, I know that the only supported formats on the camera are ImageFormat.NV21 and ImageFormat.YV12.

For <2>, I retrieved the MediaCodecInfo.CodecCapabilities object for type video/avc. The accepted integer values are 19 (corresponding to MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar) and 2130708361 (which doesn't correspond to any named value of MediaCodecInfo.CodecCapabilities). Any other value results in a crash.

Combining these settings gives different results, which I'll show below. Here's the screenshot on Android (i.e. the "real" colors):

Here are the results as shown by GStreamer:

<1> = NV21, <2> = COLOR_FormatYUV420Planar
<1> = NV21, <2> = 2130708361
<1> = YV12, <2> = COLOR_FormatYUV420Planar
<1> = YV12, <2> = 2130708361

As can be seen, none of these is satisfying. The YV12 color space looks the most promising, but it looks like red (Cr) and blue (Cb) are swapped. NV21 looks interlaced, I guess (however, I'm no expert in this field).

Since the purpose is to communicate with Skype, I assume I shouldn't change the decoder (i.e. the GStreamer command), right? Is this to be solved in Android, and if so: how? Or can it be solved by adding certain RTP payload information? Any other suggestions?

Solution

I solved it by swapping the byte planes myself on the Android side, using a simple function:

public byte[] swapYV12toI420(byte[] yv12bytes, int width, int height) {
    byte[] i420bytes = new byte[yv12bytes.length];
    for (int i = 0; i < width*height; i++)
        i420bytes[i] = yv12bytes[i];
    for (int i = width*height; i < width*height + (width/2*height/2); i++)
        i420bytes[i] = yv12bytes[i + (width/2*height/2)];
    for (int i = width*height + (width/2*height/2); i < width*height + 2*(width/2*height/2); i++)
        i420bytes[i] = yv12bytes[i - (width/2*height/2)];
    return i420bytes;
}
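To put the answer in context, here is a minimal sketch (my own illustration, not part of the original answer) of how the conversion might be wired between the camera preview callback and the encoder. It assumes the mediaCodec instance configured above, the 640x480 preview size, and the swapYV12toI420() helper from the solution; the buffer and timestamp handling are placeholders for a local test stream, not production code.

Camera.PreviewCallback previewCallback = new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // Ask the encoder for a free input buffer; drop the frame if none is available.
        int inIndex = mediaCodec.dequeueInputBuffer(10000); // wait up to 10 ms
        if (inIndex < 0) return;
        // Reorder the chroma planes so the YV12 preview frame matches COLOR_FormatYUV420Planar (I420).
        byte[] i420 = swapYV12toI420(data, 640, 480);
        ByteBuffer inputBuffer = mediaCodec.getInputBuffers()[inIndex];
        inputBuffer.clear();
        inputBuffer.put(i420);
        // A crude presentation timestamp is enough for a local test stream.
        long ptsUs = System.nanoTime() / 1000;
        mediaCodec.queueInputBuffer(inIndex, 0, i420.length, ptsUs, 0);
        // The encoded H264 is then drained with dequeueOutputBuffer() and pushed over UDP.
    }
};
mCamera.setPreviewCallback(previewCallback);

Note that devices may pad YV12 preview buffers to a 16-byte row stride; with a 640x480 preview the strides line up, but other sizes may need extra handling.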
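As a side note (my own rewrite, not from the original answer): because this conversion runs on every preview frame, the per-byte loops can be replaced with System.arraycopy. The Y plane is copied unchanged and only the two quarter-size chroma planes trade places, so a drop-in equivalent looks like this:

public byte[] swapYV12toI420Fast(byte[] yv12bytes, int width, int height) {
    byte[] i420bytes = new byte[yv12bytes.length];
    int ySize = width * height;   // luma plane size
    int cSize = ySize / 4;        // each chroma plane is a quarter of the luma plane
    System.arraycopy(yv12bytes, 0, i420bytes, 0, ySize);                 // Y: unchanged
    System.arraycopy(yv12bytes, ySize + cSize, i420bytes, ySize, cSize); // U: second chroma plane in YV12, first in I420
    System.arraycopy(yv12bytes, ySize, i420bytes, ySize + cSize, cSize); // V: first chroma plane in YV12, second in I420
    return i420bytes;
}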