I have been trying to create a script that will fetch an image file, store it in a 2D grayscale (no alpha) byte array indexed by x and y so I can perform math operations on it, then save that array back out to a new image file. I have come close, but something in the process is not doing its job: the resulting image has a lot of dark horizontal static. I have searched page after page and still found no answer. Could somebody please explain what I am doing wrong here? Below is the basics of what I am doing; between the two big comment boxes is where I will be modifying "byte data[x][y]".

package org.test.proc;

import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.IOException;
import java.awt.image.DataBufferByte;

import javax.imageio.ImageIO;

public class Main {

    public static void main(String[] args) {
        System.out.println(args[0]);
        BufferedImage img = null;
        try {
            img = ImageIO.read(new File(args[0]));
        } catch (IOException e) {
            e.printStackTrace();
        }


        int w=img.getWidth();
        int h=img.getHeight();
        byte[][] data_a = Main.getByteArray(img);
        //////////////////////////////////
        ///    ALGORITHM PHASE START   ///
        //////////////////////////////////




        /////////////////////////////////
        ///    ALGORITHM PHASE END    ///
        /////////////////////////////////

        byte[] data_l = new byte[w*h*3];
        int i=0;
        for (int x=0;x<w;x++) {
            for (int y=0;y<h;y++) {
                data_l[i]=data_a[x][y];
                i++;
                data_l[i]=data_a[x][y];
                i++;
                data_l[i]=data_a[x][y];
                i++;
                //data_l[i]=127;
                //i++;
            }
        }

        try {
            //BufferedImage image=null;
            //image = ImageIO.read(new ByteArrayInputStream(data_l));
            //System.out.println(image.toString());
            //ImageIO.write(image, "JPG", new File("test.jpg"));
            BufferedImage image = new BufferedImage(w, h, BufferedImage.TYPE_3BYTE_BGR);
            image.getWritableTile(0, 0).setDataElements(0, 0, w, h, data_l);
            ImageIO.write(image, "PNG", new File("test.png"));
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }


    }

    private static byte[][] getByteArray(BufferedImage image) {
//http://stackoverflow.com/questions/6524196/java-get-pixel-array-from-image
          final byte[] pixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
          final int width = image.getWidth();
          final int height = image.getHeight();
          final boolean hasAlphaChannel = (image.getType() != BufferedImage.TYPE_3BYTE_BGR);

          byte[][] result = new byte[width][height];
          if (hasAlphaChannel) {
              for (int pixel = 0, row = 0, col = 0; pixel < pixels.length; pixel += 4) {
                 int argb = pixels[pixel+1]; // blue
                 argb += pixels[pixel + 2]; // green
                 argb += pixels[pixel + 3]; // red
                 argb/=3;
                 result[col][row] = (byte)argb;
                 col++;
                 if (col == width) {
                    col = 0;
                    row++;
                 }
              }
          } else {
              for (int pixel = 0, row = 0, col = 0; pixel < pixels.length; pixel += 3) {
                 int argb = pixels[pixel]; // blue
                 argb += pixels[pixel+1]; // green
                 argb += pixels[pixel+2]; // red
                 argb/=3;
                 result[col][row] = (byte)argb;
                 col++;
                 if (col == width) {
                    col = 0;
                    row++;
                 }
              }
          }

          return result;
       }

}

Could somebody please take a look at this and tell me why the resulting image is just black static and not the same as the input image?

Thank you
Hoping for the best

cwarn23

All 8 Replies

Maybe it's a problem with the byte arithmetic? The RGB byte values are unsigned (0 to 255), but Java bytes are signed (-128 to +127). You may need to do some masking (ANDing with hex FF) to stop the top bit being propagated as a sign bit when you convert byte to int or vice versa.

Eg. consider this simplified case:

        byte x = (byte) 127;
        byte y = (byte) 127; 
        byte z = (byte) (x + y);
        System.out.println(x+" "+y+" "+z);

I bet you expect z to be 254. But it's -2.
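
In other words, mask with 0xFF to get back to the unsigned 0-255 value before doing any arithmetic, and only narrow back to byte at the very end. A minimal sketch (the variable names are just for illustration):

        byte b = (byte) 200;                            // stored in a byte buffer as -56
        int unsigned = b & 0xFF;                        // 200 again: the mask undoes the sign extension
        int sum = unsigned + 100;                       // arithmetic stays correct in int space (300)
        int clamped = Math.min(255, Math.max(0, sum));  // keep the result in the 0-255 range
        byte back = (byte) clamped;                     // only now narrow back to a byte
        System.out.println(unsigned + " " + sum + " " + clamped + " " + back);

This prints 200 300 255 -1: the int arithmetic is fine, and only the final cast back to byte reintroduces the signed representation.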

I just tried subtracting 128 from argb in the getByteArray() method, however I'm getting an out of bounds error. I suspect I've done something wrong with the if statement that contains the variable hasAlphaChannel in the getByteArray() method. Does anybody know a better way to check the number of bytes per color in the byte[] pixels array and decipher which index is which color?

Much appreciated
cwarn23
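
For reference, one way to answer that last question without guessing from the image type constant is to ask the image's ColorModel and Raster directly. A minimal sketch, assuming a BufferedImage called image:

        // Reports whether the image carries an alpha channel and how many
        // data elements (bytes, shorts or ints) each pixel occupies in the buffer.
        java.awt.image.ColorModel cm = image.getColorModel();
        System.out.println("has alpha:               " + cm.hasAlpha());
        System.out.println("color components:        " + cm.getNumColorComponents());
        System.out.println("total components:        " + cm.getNumComponents());
        System.out.println("data elements per pixel: " + image.getRaster().getNumDataElements());

This still won't tell you the band order on its own; that depends on the image type (TYPE_3BYTE_BGR, for example, stores blue first), which is why the getRGB approach suggested below sidesteps the problem entirely.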

Might be easier to use getRGB(int x, int y) so that you don't have to attempt to support all 14 image types that BufferedImage defines.

commented: This was the answer. Thanks +12
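
For reference, a minimal sketch of that getRGB approach (the helper name toGray is just for illustration). getRGB always returns the pixel packed as 0xAARRGGBB in the default sRGB space, whatever the BufferedImage's internal type happens to be:

    // Converts any BufferedImage to a 2D grayscale array in the 0-255 range,
    // letting getRGB normalize whatever the internal pixel layout is.
    private static int[][] toGray(java.awt.image.BufferedImage image) {
        int w = image.getWidth();
        int h = image.getHeight();
        int[][] gray = new int[w][h];
        for (int x = 0; x < w; x++) {
            for (int y = 0; y < h; y++) {
                int argb = image.getRGB(x, y);   // always 0xAARRGGBB
                int r = (argb >> 16) & 0xFF;
                int g = (argb >> 8) & 0xFF;
                int b = argb & 0xFF;
                gray[x][y] = (r + g + b) / 3;    // simple average, 0-255
            }
        }
        return gray;
    }

Keeping the result in an int[][] rather than a byte[][] also sidesteps the signed-byte problem from the earlier reply.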

Rather than getting the Image's buffer, and having to deal with all the types (as Paul mentions), you could use the PixelGrabber class to get an array of sRGB ints for any part (or all) of any Image, regardless of its internal format. The API doc for PixelGrabber also documents the correct way to extract the individual components from the int values.
If you are happy that your Image will always be a BufferedImage then Paul's getRGB is even easier.

commented: Thanks for the great help! +12
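
For reference, a minimal sketch of the PixelGrabber route, assuming an Image called img with known width w and height h, and a pixel position (x, y) inside it:

        // PixelGrabber delivers default-sRGB ARGB ints for any Image,
        // regardless of how its pixels are stored internally.
        int[] pixels = new int[w * h];
        java.awt.image.PixelGrabber grabber =
                new java.awt.image.PixelGrabber(img, 0, 0, w, h, pixels, 0, w);
        try {
            grabber.grabPixels();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        int argb  = pixels[y * w + x];       // pixel at (x, y)
        int red   = (argb >> 16) & 0xFF;
        int green = (argb >> 8) & 0xFF;
        int blue  = argb & 0xFF;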

I've simplified the method that converts the pixels from the image to byte values, however I'm still getting diagonal static, like when you have no reception on an analog TV. I'm not sure what could be causing such a problem. Can anybody else spot what errors would be causing this? It would be most generous to post some corrected code, as I've been struggling for over a week now on the above snippet of code.

private static byte[][] getByteArray(BufferedImage image) {
          final int width = image.getWidth(null);
          final int height = image.getHeight(null);
          int[] pixels = new int[width*height];
          byte[][] result = new byte[width][height];
          for (int index = 0, row = 0, col = 0; index < pixels.length; index++) {
             Color c = new Color(image.getRGB(col,row));
             int argb=c.getRed();
             argb+=c.getBlue();
             argb+=c.getGreen();
             argb/=3;
             result[col][row] = (byte)(Math.floor(argb+0.5)-128);
             col++;
             if (col == width) {
                col = 0;
                row++;
             }
          }
          return result;
       }

The method posted above is the new version of the BufferedImage-to-2D-byte-array method, however problems still occur somewhere in the overall script, because the generated image is not recognizable as the original image when in fact it should be a copy of it. Later on, once this is working, I will alter the image through the 2D array by changing the grayscale values so that a different image is generated, but for that I need at least a working copy to be generated in the first place.

Any Ideas?
cwarn23

Finally I managed to solve this one. It was not only the image-to-byte decoder that was messing up the result; the byte-to-image-to-file encoder also needed a total rewrite. Unfortunately there are very few references on the web about doing what I wanted to do, but there was just enough to find a resolution - my New Year's resolution, I guess. Below is a code snippet for those who want to be able to modify images with Java, using the code I have fixed from the first post.

package org.test.proc;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

import javax.imageio.ImageIO;
public class Main {
    public static void main(String[] args) {
        System.out.println(args[0]);
        BufferedImage img = null;
        try {
            img = ImageIO.read(new File(args[0]));
        } catch (IOException e) {
            e.printStackTrace();
        }
        int w=img.getWidth();
        int h=img.getHeight();
        byte[][] data_a = Main.getByteArray(img);
        //////////////////////////////////
        ///    ALGORITHM PHASE START   ///
        //////////////////////////////////






        /////////////////////////////////
        ///    ALGORITHM PHASE END    ///
        /////////////////////////////////
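        // Pack each grayscale byte back into an RGB int: undo the -128 offset applied in
        // getByteArray, then repeat the same 0-255 value in the red, green and blue channels.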
        int pxall=w*h;
        int[] px = new int[pxall];
        for (int i=0,x=0,y=0;i<pxall;i++) {
            px[i] = ((data_a[x][y]+128)<<16) | ((data_a[x][y]+128)<<8) | (data_a[x][y]+128);
            x++;
            if (x==w) {
                x=0;
                y++;
            }
        }
        BufferedImage resimg = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        resimg.setRGB(0, 0, w, h, px, 0, w);
        try {
            ImageIO.write(resimg, "png", new File("test.png"));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    private static byte[][] getByteArray(BufferedImage image) {
//http://stackoverflow.com/questions/6524196/java-get-pixel-array-from-image
          final int width = image.getWidth();
          final int height = image.getHeight();
          int[] pixels = new int[width*height];
          byte[][] result = new byte[width][height];
          for (int index = 0, row = 0, col = 0; index < pixels.length; index++) {
             // getRGB hands back a default-sRGB 0xAARRGGBB int: mask out blue, green and red,
             // average them to 0-255, then shift down by 128 so the value fits in a signed byte.
             int n = image.getRGB(col,row);
             result[col][row] = (byte)((((n&0xFF)+((n>>8)&0xFF)+((n>>16)&0xFF))/3)-128);
             col++;
             if (col == width) {
                col = 0;
                row++;
             }
          }
          return result;
       }
}
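
To give an idea of what can go between the ALGORITHM PHASE comment boxes, here is a minimal sketch that inverts the image. Because getByteArray stores each pixel as its 0-255 grayscale average shifted down by 128, inverting the brightness comes down to taking the bitwise complement of each stored byte:

        // Example algorithm phase: invert the grayscale image.
        // Each stored byte is (gray - 128), so 255 - gray is the same as ~b in this representation.
        for (int x = 0; x < w; x++) {
            for (int y = 0; y < h; y++) {
                data_a[x][y] = (byte) ~data_a[x][y];
            }
        }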

Apart from that, it's all fixed and solved. Positive reputation points have been added for giving me the right leads.

Line 12 looks wrong - you need to set the RGB components, not just stuff a single byte into the ARGB int (the byte value will be the blue, but if it's < 0 then the A, R and G will all be 255, because the sign bit of the byte will be propagated all the way through the int value; if it's >= 0, the A, R and G will all be 0, and a zero alpha won't help!).

Update: This post referred to the earlier version of the code - the latest version deals with this problem.
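
For reference, a minimal sketch of packing a single grayscale byte b into an ARGB int safely, assuming a BufferedImage called image and a pixel position (x, y): mask first so a negative byte cannot sign-extend through the int, repeat the 0-255 value in R, G and B, and set the alpha to fully opaque.

        int gray = b & 0xFF;                                        // undo the byte's sign extension
        int argb = 0xFF000000 | (gray << 16) | (gray << 8) | gray;  // opaque alpha, equal R, G and B
        image.setRGB(x, y, argb);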

Although all that bit-twiddling can work, it's pretty tacky and error-prone.
If I were doing this I would use getRGB and immediately create a new Color instance from it. Then I can use Color's methods, including getRed, getGreen etc., as well as more advanced stuff like darker/brighter or converting to HSB (so I can set the saturation to zero and use that to create a new grayscale version of the same image). After processing the Color you can then use getRGB to get the int value you need to put back into the buffered image. Safely.
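
For reference, a minimal sketch of that Color-based round trip for one pixel, assuming a BufferedImage called image and coordinates (x, y):

        // Read the pixel, drop its saturation via HSB, and write it back.
        java.awt.Color c = new java.awt.Color(image.getRGB(x, y));
        float[] hsb = java.awt.Color.RGBtoHSB(c.getRed(), c.getGreen(), c.getBlue(), null);
        int grayRGB = java.awt.Color.HSBtoRGB(hsb[0], 0f, hsb[2]);  // saturation set to zero
        image.setRGB(x, y, grayRGB);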
