@Override
public void onImageAvailable(ImageReader reader) {
    Image image = null;
    try {
        image = reader.acquireLatestImage();
        Image.Plane[] planes = image.getPlanes();
        ByteBuffer buffer = planes[0].getBuffer();
        byte[] data = new byte[buffer.capacity()];
        buffer.get(data);
        //data.length=332803; width=3264; height=2448
        Log.e(TAG, "data.length=" + data.length + "; width=" + image.getWidth() + "; height=" + image.getHeight());
        //TODO data processing
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (image != null) {
            image.close();
        }
    }
}

Each time the length of data is different, but the image width and height are the same.
Main problem: data.length is much too small for a resolution like 3264x2448.
The size of the data array should be 3264*2448 = 7,990,272, not 300,000 - 600,000.
What is wrong?

imageReader = ImageReader.newInstance(3264, 2448, ImageFormat.JPEG, 5);

I solved this problem by using the YUV_420_888 image format and converting it to JPEG manually.

imageReader = ImageReader.newInstance(MAX_PREVIEW_WIDTH, MAX_PREVIEW_HEIGHT,
                                      ImageFormat.YUV_420_888, 5);
imageReader.setOnImageAvailableListener(this, null);
Surface imageSurface = imageReader.getSurface();
List<Surface> surfaceList = new ArrayList<>();
//...add other surfaces
previewRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
previewRequestBuilder.addTarget(imageSurface);
surfaceList.add(imageSurface);
cameraDevice.createCaptureSession(surfaceList,
        new CameraCaptureSession.StateCallback() {
            //...implement onConfigured, onConfigureFailed for StateCallback
        }, null);
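The StateCallback body is elided above; below is a minimal sketch of what it usually contains, assuming the previewRequestBuilder and TAG fields from the surrounding code (the background Handler is left as null here for brevity):

new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(@NonNull CameraCaptureSession session) {
        try {
            // Start streaming frames into every target added to previewRequestBuilder,
            // including the ImageReader surface above.
            session.setRepeatingRequest(previewRequestBuilder.build(), null, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onConfigureFailed(@NonNull CameraCaptureSession session) {
        Log.e(TAG, "Capture session configuration failed");
    }
}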
@Override
public void onImageAvailable(ImageReader reader) {
    Image image = reader.acquireLatestImage();
    if (image != null) {
        //converting to JPEG
        byte[] jpegData = ImageUtil.imageToByteArray(image);
        //write to file (for example ..some_path/frame.jpg)
        FileManager.writeFrame(FILE_NAME, jpegData);
        image.close();
    }
}
public final class ImageUtil {

    public static byte[] imageToByteArray(Image image) {
        byte[] data = null;
        if (image.getFormat() == ImageFormat.JPEG) {
            Image.Plane[] planes = image.getPlanes();
            ByteBuffer buffer = planes[0].getBuffer();
            data = new byte[buffer.capacity()];
            buffer.get(data);
            return data;
        } else if (image.getFormat() == ImageFormat.YUV_420_888) {
            data = NV21toJPEG(
                    YUV_420_888toNV21(image),
                    image.getWidth(), image.getHeight());
        }
        return data;
    }

    private static byte[] YUV_420_888toNV21(Image image) {
        byte[] nv21;
        ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
        ByteBuffer vuBuffer = image.getPlanes()[2].getBuffer();
        int ySize = yBuffer.remaining();
        int vuSize = vuBuffer.remaining();
        nv21 = new byte[ySize + vuSize];
        yBuffer.get(nv21, 0, ySize);
        vuBuffer.get(nv21, ySize, vuSize);
        return nv21;
    }

    private static byte[] NV21toJPEG(byte[] nv21, int width, int height) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
        yuv.compressToJpeg(new Rect(0, 0, width, height), 100, out);
        return out.toByteArray();
    }
}
public final class FileManager {

    public static void writeFrame(String fileName, byte[] data) {
        try {
            BufferedOutputStream bos = new BufferedOutputStream(new FileOutputStream(fileName));
            bos.write(data);
            bos.flush();
            bos.close();
//            Log.e(TAG, "" + data.length + " bytes have been written to " + filesDir + fileName + ".jpg");
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
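For reference, a minimal usage sketch (the context variable and the output path here are illustrative, not part of the original answer):

// Illustrative only: resolve an absolute output path under the app's external files
// directory before passing it to FileManager.writeFrame().
File outFile = new File(context.getExternalFilesDir(null), "frame.jpg");
FileManager.writeFrame(outFile.getAbsolutePath(), jpegData);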
I've used the ImageUtil class from the answer to convert YUV_420_888 to JPEG, but I am not getting a correct output. By using this class, will it be possible to save the output byte[] to a JPEG file or even display it on a SurfaceView? I am trying to do so, but I'm getting distorted images. – ahasbini May 17, 2017 at 9:45

data = NV21toJPEG(YUV_420_888toNV21(image), image.getWidth(), image.getHeight()); causes distortions? – Volodymyr Kulyk May 17, 2017 at 9:55

Yes, I've placed the class as-is and implemented the file writing similarly to your implementation but using FileOutputStream instead, and the final image is distorted. I can show you the complete implementation via a git repo; I'll push it in a bit. – ahasbini May 17, 2017 at 9:58

I am not sure, but I think you are taking only one of the planes of the YUV_420_888 format (the luminance plane).

In my case, I usually transform my image to a byte[] this way:

Image m_img = reader.acquireLatestImage(); // obtain the latest frame from your ImageReader
Log.v(LOG_TAG, "Format -> " + m_img.getFormat());
Image.Plane Y = m_img.getPlanes()[0];
Image.Plane U = m_img.getPlanes()[1];
Image.Plane V = m_img.getPlanes()[2];
int Yb = Y.getBuffer().remaining();
int Ub = U.getBuffer().remaining();
int Vb = V.getBuffer().remaining();
//your data length should be this byte array length
byte[] data = new byte[Yb + Ub + Vb];
Y.getBuffer().get(data, 0, Yb);
U.getBuffer().get(data, Yb, Ub);
V.getBuffer().get(data, Yb + Ub, Vb);
final int width = m_img.getWidth();
final int height = m_img.getHeight();

And I use this byte array to convert to RGB.
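The RGB conversion itself is not shown in this answer. Below is a minimal sketch of the usual integer BT.601 math, assuming the three planes copied above are tightly packed (pixel stride 1, row stride equal to the width), which is not guaranteed on every device, so check Plane.getPixelStride() and Plane.getRowStride() before relying on it:

// Hypothetical helper, not part of the original answer. Assumes data holds the Y plane
// followed by tightly packed U and V planes (I420-style layout with no padding).
public static int[] yuvPlanarToArgb(byte[] data, int width, int height) {
    int[] argb = new int[width * height];
    int ySize = width * height;
    int uvWidth = width / 2;
    for (int j = 0; j < height; j++) {
        for (int i = 0; i < width; i++) {
            int y = data[j * width + i] & 0xFF;
            int uvIndex = (j / 2) * uvWidth + (i / 2);
            int u = (data[ySize + uvIndex] & 0xFF) - 128;
            int v = (data[ySize + ySize / 4 + uvIndex] & 0xFF) - 128;
            int r = clamp(y + (int) (1.402f * v));
            int g = clamp(y - (int) (0.344f * u + 0.714f * v));
            int b = clamp(y + (int) (1.772f * u));
            argb[j * width + i] = 0xFF000000 | (r << 16) | (g << 8) | b;
        }
    }
    return argb;
}

private static int clamp(int x) {
    return x < 0 ? 0 : (x > 255 ? 255 : x);
}

The resulting int[] can be wrapped with Bitmap.createBitmap(argb, width, height, Bitmap.Config.ARGB_8888) for display.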

Hope this helps.

Cheers. Unai.

Thank you for the response! Currently I'm using ImageFormat.JPEG, so m_img.getPlanes().length is equal to 1. – Volodymyr Kulyk Oct 18, 2016 at 14:48

Great, you are welcome. I didn't know what your purpose with this byteArray was. Sorry for my delayed answer. – uelordi Oct 18, 2016 at 15:27

Your code is requesting JPEG-format images, which are compressed. They'll change in size for every frame, and they'll be much smaller than the uncompressed image. If you want to do nothing besides save JPEG images, you can just save what you have in the byte[] data to disk and you're done.

If you want to actually do something with the JPEG, you can use BitmapFactory.decodeByteArray() to convert it to a Bitmap, for example, though that's pretty inefficient.
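For example, with the byte[] data from the question's code:

// Decode the compressed JPEG bytes delivered by the ImageReader into a Bitmap.
// Doing this for every frame is expensive, as noted above.
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);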

Or you can switch to YUV, which is more efficient, but you need to do more work to get a Bitmap out of it.

Your current solution seems to be to reshuffle the YUV data into NV21 and then compress it to a JPEG; it's not clear what you want the YUV for, but it's certainly cheaper to just ask for JPEG if all you want to do is save the data. You can ask for both JPEG and YUV, depending on resolution, which might be the most efficient option, though JPEG output may have a lower frame rate than 30fps. – Eddy Talvala Oct 24, 2016 at 0:17
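A rough sketch of the "both JPEG and YUV" suggestion (the sizes, maxImages values, and the stateCallback reference are illustrative; query the device's StreamConfigurationMap for the sizes it actually supports):

// YUV stream for continuous per-frame processing, JPEG stream for occasional stills.
ImageReader yuvReader = ImageReader.newInstance(1920, 1080, ImageFormat.YUV_420_888, 4);
ImageReader jpegReader = ImageReader.newInstance(3264, 2448, ImageFormat.JPEG, 2);

List<Surface> surfaces = new ArrayList<>();
surfaces.add(yuvReader.getSurface());
surfaces.add(jpegReader.getSurface());
cameraDevice.createCaptureSession(surfaces, stateCallback, null);

// Add yuvReader.getSurface() to the repeating preview request, and add jpegReader.getSurface()
// only to a one-shot TEMPLATE_STILL_CAPTURE request when a full-resolution JPEG is needed.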
