Based on the following logic (as far as I understand it), I wrote a conversion from YUV_420_888 to Bitmap:
To summarize the approach: the kernel's x and y coordinates are identical both to the x and y of the non-padded part of the Y plane (2-d Allocation) and to the x and y of the output Bitmap. The U and V planes, however, are structured differently from the Y plane: one byte covers four pixels, their PixelStride may be greater than one, and they may also have padding that differs from the Y plane's. So, to access the U's and V's efficiently from the kernel, I put them into 1-d Allocations and created an index "uvIndex" that gives the position of the corresponding U and V within those 1-d Allocations, for a given (x, y) coordinate in the (non-padded) Y plane (and hence on the output Bitmap).
To keep the rs-Kernel lean, I exclude the padding area of the yPlane by restricting the x range via LaunchOptions (this reflects the RowStride of the y plane, which can thus be ignored WITHIN the kernel). So we just need to consider the uvPixelStride and uvRowStride within uvIndex, i.e. the index used to access the u and v values.
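To make the indexing concrete, here is a small plain-Java sketch of the same uvIndex computation; the stride values in the usage below are made-up examples, not taken from a real device:

```java
// Standalone illustration of the uvIndex formula: for a Y-plane coordinate
// (x, y), the matching chroma sample lives at
//   uvPixelStride * (x / 2) + uvRowStride * (y / 2)
// in the 1-d U/V allocation, because each U/V byte covers a 2x2 block of Y samples.
public class UvIndexDemo {
    static int uvIndex(int x, int y, int uvPixelStride, int uvRowStride) {
        // Integer division intentionally floors x and y to the 2x2 block origin.
        return uvPixelStride * (x / 2) + uvRowStride * (y / 2);
    }
}
```

For example, with uvPixelStride = 2 and uvRowStride = 640, the coordinates (0,0), (1,0), and (1,1) all map to index 0, while (3,5) maps to 2*1 + 640*2 = 1282.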
Here is my code:
RenderScript kernel, named yuv420888.rs:
#pragma version(1)
#pragma rs java_package_name(com.xxxyyy.testcamera2);
#pragma rs_fp_relaxed

int32_t width;
int32_t height;
uint picWidth, uvPixelStride, uvRowStride;
rs_allocation ypsIn, uIn, vIn;

// The LaunchOptions ensure that the Kernel does not enter the padding zone of Y,
// so yRowStride can be ignored WITHIN the Kernel.
uchar4 __attribute__((kernel)) doConvert(uint32_t x, uint32_t y) {
    // index for accessing the uIn's and vIn's
    uint uvIndex = uvPixelStride * (x / 2) + uvRowStride * (y / 2);
    // get the y, u, v values
    uchar yps = rsGetElementAt_uchar(ypsIn, x, y);
    uchar u = rsGetElementAt_uchar(uIn, uvIndex);
    uchar v = rsGetElementAt_uchar(vIn, uvIndex);
    // calc argb
    int4 argb;
    argb.r = yps + v * 1436 / 1024 - 179;
    argb.g = yps - u * 46549 / 131072 + 44 - v * 93604 / 131072 + 91;
    argb.b = yps + u * 1814 / 1024 - 227;
    argb.a = 255;
    uchar4 out = convert_uchar4(clamp(argb, 0, 255));
    return out;
}
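The kernel's fixed-point YUV-to-RGB arithmetic can be sanity-checked off-device with an equivalent plain-Java version (a sketch mirroring the same integer formulas, not the author's code):

```java
// Plain-Java mirror of the kernel's fixed-point YUV -> RGB conversion,
// useful for checking single pixels without running RenderScript.
public class YuvMath {
    static int clamp(int v) {
        return Math.max(0, Math.min(255, v));
    }

    // Returns {r, g, b} for one pixel, matching the kernel's integer math
    // (integer division truncates exactly as in the rs code).
    static int[] toRgb(int yps, int u, int v) {
        int r = yps + v * 1436 / 1024 - 179;
        int g = yps - u * 46549 / 131072 + 44 - v * 93604 / 131072 + 91;
        int b = yps + u * 1814 / 1024 - 227;
        return new int[] { clamp(r), clamp(g), clamp(b) };
    }
}
```

A neutral-chroma pixel (u = v = 128) should come out as (near-)gray: toRgb(0, 128, 128) clamps to (0, 0, 0), and toRgb(255, 128, 128) yields (255, 254, 254) because of integer truncation in the fixed-point constants.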
Java side:
private Bitmap YUV_420_888_toRGB(Image image, int width, int height) {
    // Get the three image planes
    Image.Plane[] planes = image.getPlanes();
    ByteBuffer buffer = planes[0].getBuffer();
    byte[] y = new byte[buffer.remaining()];
    buffer.get(y);

    buffer = planes[1].getBuffer();
    byte[] u = new byte[buffer.remaining()];
    buffer.get(u);

    buffer = planes[2].getBuffer();
    byte[] v = new byte[buffer.remaining()];
    buffer.get(v);

    // get the relevant RowStrides and PixelStrides
    // (we know from documentation that PixelStride is 1 for y)
    int yRowStride = planes[0].getRowStride();
    int uvRowStride = planes[1].getRowStride();     // we know from documentation that RowStride is the same for u and v.
    int uvPixelStride = planes[1].getPixelStride(); // we know from documentation that PixelStride is the same for u and v.

    // rs creation just for demo. Create rs just once in onCreate and use it again.
    RenderScript rs = RenderScript.create(this);
    //RenderScript rs = MainActivity.rs;
    ScriptC_yuv420888 mYuv420 = new ScriptC_yuv420888(rs);

    // Y, U, V are defined as global allocations, the out-Allocation is the Bitmap.
    // Note also that uAlloc and vAlloc are 1-dimensional while yAlloc is 2-dimensional.
    Type.Builder typeUcharY = new Type.Builder(rs, Element.U8(rs));
    typeUcharY.setX(yRowStride).setY(height);
    Allocation yAlloc = Allocation.createTyped(rs, typeUcharY.create());
    yAlloc.copyFrom(y);
    mYuv420.set_ypsIn(yAlloc);

    Type.Builder typeUcharUV = new Type.Builder(rs, Element.U8(rs));
    // note that the size of the u's and v's are as follows:
    //   ( (width/2)*PixelStride + padding ) * (height/2)
    //   = (RowStride) * (height/2)
    // but I noted that on the S7 it is 1 less...
    typeUcharUV.setX(u.length);
    Allocation uAlloc = Allocation.createTyped(rs, typeUcharUV.create());
    uAlloc.copyFrom(u);
    mYuv420.set_uIn(uAlloc);

    Allocation vAlloc = Allocation.createTyped(rs, typeUcharUV.create());
    vAlloc.copyFrom(v);
    mYuv420.set_vIn(vAlloc);

    // handover parameters
    mYuv420.set_picWidth(width);
    mYuv420.set_uvRowStride(uvRowStride);
    mYuv420.set_uvPixelStride(uvPixelStride);

    Bitmap outBitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    Allocation outAlloc = Allocation.createFromBitmap(rs, outBitmap,
            Allocation.MipmapControl.MIPMAP_NONE, Allocation.USAGE_SCRIPT);

    Script.LaunchOptions lo = new Script.LaunchOptions();
    lo.setX(0, width);  // by this we ignore the y's padding zone, i.e. the right side of x between width and yRowStride
    lo.setY(0, height);
    mYuv420.forEach_doConvert(outAlloc, lo);
    outAlloc.copyTo(outBitmap);

    return outBitmap;
}
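Regarding the "1 less" observation in the comment above: a plausible explanation (an assumption based on the buffer only being guaranteed to extend to the last addressable sample, not a documented formula) is that the trailing pixel-stride gap after the very last chroma sample need not be present in the buffer. A small sketch of that reasoning:

```java
// Hedged sketch: why u.length can be one byte less than uvRowStride * (height/2).
// Assumption: the chroma buffer only needs to reach the last valid sample, so
// the trailing gap of the final pixel (and final row padding) may be absent.
public class ChromaSize {
    static int minChromaBytes(int width, int height, int pixelStride, int rowStride) {
        int rows = height / 2;  // chroma rows
        int cols = width / 2;   // chroma samples per row
        // Full rows except the last, plus the last row up to its final sample.
        return rowStride * (rows - 1) + pixelStride * (cols - 1) + 1;
    }
}
```

With the S7's reported layout (pixelStride = 2, rowStride = width, no padding), e.g. 640x480, this gives 640*240 - 1, i.e. exactly one byte short; with pixelStride = 1 and rowStride = width/2 (the Nexus 7 case) there is no deficit.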
Testing on a Nexus 7 (API 22), this returns nice colorful Bitmaps. That device, however, has a trivial pixelstride (= 1) and no padding (i.e. rowstride == width). Testing on a brand-new Samsung S7 (API 23), I get pictures whose colors are incorrect - except for the greens. But the picture does not show a general bias towards green; it just seems that non-green colors are not reproduced correctly. Note that the S7 applies a u/v pixelstride of 2, and no padding.
Since the most crucial line of code is within the rs code, I suspect the access to the u/v planes, uint uvIndex = (...), may be the problem, possibly due to an incorrect consideration of the pixelstrides here. Does anyone see the solution? Thanks.
UPDATE: I checked everything, and I am pretty sure the code regarding the access of y, u, and v is correct. So the problem must lie with the u and v values themselves. The non-green colors have a purple tilt, and looking at the u and v values, they seem to be in a rather narrow range of about 110-150. Is it really possible that we need to cope with device-specific YUV -> RGB conversions? Did I miss anything?
UPDATE 2: The code has been corrected, thanks to Eddy's feedback; it now works.
Best Answer
Looking at
floor((float) uvPixelStride*(x)/2)
which calculates your U, V row offset (uv_row_offset) from the Y x-coordinate:
if uvPixelStride = 2, then as x increases:
x = 0, uv_row_offset = 0
x = 1, uv_row_offset = 1
x = 2, uv_row_offset = 2
x = 3, uv_row_offset = 3
and that is incorrect. Since uvPixelStride = 2, there is no valid U/V pixel value at uv_row_offset = 1 or 3.
You want
uvPixelStride * floor(x/2)
(assuming you don't trust yourself to remember the critical rounding behavior of integer division, although if you do, then)
uvPixelStride * (x/2)
should be enough.
With that, your mapping becomes:
x = 0, uv_row_offset = 0
x = 1, uv_row_offset = 0
x = 2, uv_row_offset = 2
x = 3, uv_row_offset = 2
See if that fixes the color errors. In practice, the incorrect addressing here would mean every other color sample comes from the wrong color plane, since the underlying YUV data is most likely semi-planar (so the U plane starts within the V plane, offset by one byte, with the two planes interleaved).
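The buggy and corrected offset formulas from the answer can be compared directly in plain Java (an illustrative sketch, with uvPixelStride fixed at 2):

```java
// Compares the buggy and corrected chroma offset formulas for uvPixelStride = 2.
public class OffsetDemo {
    static final int UV_PIXEL_STRIDE = 2;

    // Buggy: floor((float) uvPixelStride * x / 2) advances on every x,
    // so odd x values land between valid U/V samples.
    static int wrongOffset(int x) {
        return (int) Math.floor((float) UV_PIXEL_STRIDE * x / 2);
    }

    // Correct: uvPixelStride * (x / 2) repeats each offset for two x values,
    // so both Y samples of a 2x2 block read the same chroma sample.
    static int rightOffset(int x) {
        return UV_PIXEL_STRIDE * (x / 2);
    }
}
```

For x = 0..3 the buggy formula yields offsets 0, 1, 2, 3 while the corrected one yields 0, 0, 2, 2 - exactly the two mappings tabulated above.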
About android - YUV_420_888 interpretation on Samsung Galaxy S7 (Camera2): we found a similar question on Stack Overflow: https://stackoverflow.com/questions/36212904/