Converting camera YUV data to ARGB with RenderScript (tags: camera, data, renderscript, YUV)

2023-09-05 02:14:35 Author: 笑傲红尘


This is my first question here on Stack Overflow.

My problem is: I've set up a camera on Android and receive the preview data through an onPreviewFrame listener, which passes me a byte[] array containing the image data in the default Android YUV format (the device does not support the R5G6B5 format). Each pixel consists of 12 bits, which makes things a little tricky. Now what I want to do is convert the YUV data into ARGB data in order to do image processing with it. This has to be done with RenderScript in order to maintain high performance.
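The 12 bits per pixel follow from 4:2:0 chroma subsampling: a full-resolution Y plane plus a half-resolution interleaved chroma plane, so one frame occupies width * height * 3/2 bytes. A quick sanity check of that arithmetic (a minimal sketch; the class and method names are mine, not from the post):

```java
public class Yuv420Size {
    // Bytes needed for one YUV420 (e.g. NV21) frame:
    // width*height luma bytes plus width*height/2 interleaved chroma bytes,
    // which works out to 12 bits per pixel on average.
    static int frameBytes(int width, int height) {
        return width * height * 3 / 2;
    }

    public static void main(String[] args) {
        System.out.println(frameBytes(640, 480)); // 460800, i.e. 12 bits/pixel
    }
}
```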

My idea was to pass two pixels in one element (which would be 24 bits = 3 bytes) and then return two ARGB pixels. The problem is that in RenderScript a u8_3 (a 3-component 8-bit vector) is stored in 32 bits, which means the last 8 bits are unused. But when the image data is copied into the Allocation, all 32 bits are used, so the last 8 bits get lost. Even if I used 32-bit input data, the last 8 bits would be useless, because they would contain only 2/3 of a pixel. When I define an element consisting of a 3-byte array, it actually has a real size of 3 bytes, but then the Allocation.copyFrom() method doesn't fill the input Allocation with data, arguing that it doesn't have the right data type to be filled with a byte[].

The RenderScript documentation states that there is a ScriptIntrinsicYuvToRGB which should do exactly that in API level 17. But in fact the class doesn't exist. I've downloaded API level 17, even though it no longer seems to be downloadable. Does anyone have any information about it? Has anyone ever tried out a ScriptIntrinsic?

So in conclusion my question is: how can I convert the camera data into ARGB data quickly, hardware-accelerated?

This is how to do it in the Dalvik VM (I found the code somewhere online, and it works):

@SuppressWarnings("unused")
private void decodeYUV420SP(int[] rgb, byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;  
    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;  
        for (int i = 0; i < width; i++, yp++) {  
            int y = (0xff & ((int) yuv420sp[yp])) - 16;  
            if (y < 0)
                y = 0;  
            if ((i & 1) == 0) {  
                v = (0xff & yuv420sp[uvp++]) - 128;  
                u = (0xff & yuv420sp[uvp++]) - 128;  
            }  
            int y1192 = 1192 * y;  
            int r = (y1192 + 1634 * v);  
            int g = (y1192 - 833 * v - 400 * u);  
            int b = (y1192 + 2066 * u);  
            if (r < 0)
                r = 0;
            else if (r > 262143)
                r = 262143;  
            if (g < 0)
                g = 0;
            else if (g > 262143)
                g = 262143;  
            if (b < 0)
                b = 0;
            else if (b > 262143)  
                b = 262143;  
            rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);  
        }
    }
}
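The magic numbers above are a 10-bit fixed-point approximation of the standard BT.601 conversion (1192 ≈ 1.164 × 1024, 1634 ≈ 1.596 × 1024, and so on), and the clamp to 262143 = 2^18 − 1 plus the final shifts remove the scaling again. Restating the same math for a single pixel makes it easy to check that neutral input comes out neutral (a standalone sketch; class and method names are mine):

```java
public class Yuv601Check {
    // Convert one YUV pixel with the same fixed-point coefficients as
    // decodeYUV420SP above: BT.601 scaled by 1024, clamped to 18 bits.
    static int yuvToArgb(int yByte, int uByte, int vByte) {
        int y = Math.max(0, (0xff & yByte) - 16);
        int u = (0xff & uByte) - 128;
        int v = (0xff & vByte) - 128;
        int y1192 = 1192 * y;
        int r = clamp(y1192 + 1634 * v);
        int g = clamp(y1192 - 833 * v - 400 * u);
        int b = clamp(y1192 + 2066 * u);
        // Shift the 18-bit intermediate values into 8-bit ARGB channels.
        return 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
    }

    static int clamp(int c) {
        return Math.min(262143, Math.max(0, c));
    }

    public static void main(String[] args) {
        // Neutral chroma (U = V = 128) must give R = G = B.
        System.out.printf("%08x%n", yuvToArgb(128, 128, 128)); // prints ff828282
        // Video black (Y = 16) with neutral chroma gives opaque black.
        System.out.printf("%08x%n", yuvToArgb(16, 128, 128));  // prints ff000000
    }
}
```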

Solution

I'm sure you will find the LivePreview test application interesting... it's part of the Android source code in the latest Jelly Bean (MR1) release. It implements a camera preview and uses ScriptIntrinsicYuvToRgb to convert the preview data with RenderScript. You can browse the source online here:

LivePreview
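For reference, the core of that intrinsic-based approach boils down to roughly the following (a sketch against the android.renderscript API from API level 17; the exact setup in LivePreview differs, the class and method names here are mine, and this only runs on an Android device):

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.renderscript.*;

// Sketch: convert one NV21 preview frame into an ARGB_8888 Bitmap
// using ScriptIntrinsicYuvToRGB instead of a hand-written Java loop.
public class YuvConverter {
    private final RenderScript rs;
    private final ScriptIntrinsicYuvToRGB yuvToRgb;
    private Allocation in, out;

    public YuvConverter(Context ctx) {
        rs = RenderScript.create(ctx);
        yuvToRgb = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));
    }

    public void convertFrame(byte[] nv21, int width, int height, Bitmap argbOut) {
        if (in == null) {
            // One flat U8 allocation holds the whole 12-bit-per-pixel frame,
            // sidestepping the u8_3 padding problem from the question.
            Type yuvType = new Type.Builder(rs, Element.U8(rs))
                    .setX(nv21.length).create();
            in = Allocation.createTyped(rs, yuvType, Allocation.USAGE_SCRIPT);
            Type rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs))
                    .setX(width).setY(height).create();
            out = Allocation.createTyped(rs, rgbaType, Allocation.USAGE_SCRIPT);
            yuvToRgb.setInput(in);
        }
        in.copyFrom(nv21);     // upload the camera bytes
        yuvToRgb.forEach(out); // hardware-accelerated conversion
        out.copyTo(argbOut);   // read back into the Bitmap
    }
}
```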