There are several things you seem to be doing wrong.
The PixelGrabber class is used to retrieve pixel information from Toolkit images, since the Image class doesn't have any convenient way to obtain RGB information. A BufferedImage, on the other hand, has about 4 ways to retrieve pixel information, depending on how low-level you want to go. One of them is to simply call BufferedImage#getRGB. No PixelGrabber needed.
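As a minimal sketch of the getRGB approach (the class and method names here are mine, not part of any API):

```java
import java.awt.image.BufferedImage;

public class GrabPixels {
    // All the ARGB pixels in one call -- no PixelGrabber needed.
    // Passing null for the destination array makes getRGB allocate one.
    static int[] grab(BufferedImage image) {
        return image.getRGB(0, 0, image.getWidth(), image.getHeight(),
                            null, 0, image.getWidth());
    }
}
```

Each element of the returned array is one pixel packed as 0xAARRGGBB.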
The second thing you're doing wrong is turning an int array into a byte array by casting the ints into bytes. Each byte in the int represents a channel - alpha, red, green, or blue. By casting you lose all but the blue component. What you need to do is make the byte array 3 or 4 times the size of the int array, then store the lowest 3 bytes or all 4 bytes of each int into the byte array using whichever endianness you want. I'm sure if you poke around the Java API, you can find classes that do exactly this for you. In fact, you can just wrap your network streams in DataOutputStream and DataInputStream and let them do it.
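Here's a sketch of both options - unpacking by hand versus letting DataOutputStream do it (class and method names are mine):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class PixelBytes {
    // Expand each ARGB int into 4 bytes, big-endian. Note that a plain
    // (byte) cast would keep only the low byte -- the blue component.
    static byte[] toBytes(int[] pixels) {
        byte[] out = new byte[pixels.length * 4];
        for (int i = 0; i < pixels.length; i++) {
            int p = pixels[i];
            out[i * 4]     = (byte) (p >>> 24); // alpha
            out[i * 4 + 1] = (byte) (p >>> 16); // red
            out[i * 4 + 2] = (byte) (p >>> 8);  // green
            out[i * 4 + 3] = (byte) p;          // blue
        }
        return out;
    }

    // Same result via DataOutputStream, which writes ints big-endian.
    static byte[] toBytesStream(int[] pixels) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        DataOutputStream dos = new DataOutputStream(baos);
        for (int p : pixels) {
            dos.writeInt(p);
        }
        return baos.toByteArray();
    }
}
```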
The third thing you seem to be doing wrong is trying to send raw images over a network. Images can very quickly reach hundreds of KB to several MB in size. Compare your current method to compressing the image first (ImageIO#write + ByteArrayOutputStream), sending the compressed data, and then turning it back into a Java Image on the other side (ImageIO#read + ByteArrayInputStream, or Toolkit). You spend time compressing/decompressing the image, but you'll probably spend a lot less time sending much less data over the network.
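The round trip looks something like this (PNG chosen here because it's lossless; class and method names are mine):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

public class CompressedSend {
    // Compress the image into PNG bytes -- this is what you'd send.
    static byte[] compress(BufferedImage image) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(image, "png", baos);
        return baos.toByteArray();
    }

    // Turn the received bytes back into an image on the other side.
    static BufferedImage decompress(byte[] data) throws IOException {
        return new ByteArrayInputStream(data) instanceof ByteArrayInputStream
                ? ImageIO.read(new ByteArrayInputStream(data))
                : null;
    }
}
```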
Thanks for your reply.
There are a couple of things that I missed or misstated in my question.
First, I shouldn't have said network: I'm accessing a custom-made hardware device over USB. Second, as my hardware device has very stringent memory requirements, I thought converting the individual elements of the int array to byte values would help me. And third, as the destination is a hardware device, I'm not sure whether decompression will be supported on that side, and to keep the number of bytes to be transferred constant, I chose to send the pixel data instead of the byte data I get from
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(image, format, baos);
I was sure I was losing information by converting int to byte, but wasn't sure what information I was losing. I will try sending all the pixel information, so as to make the number of bytes transferred constant, or, if that's too large, I will try sending the byte array constructed from the above code snippet instead.
Anyways, thanks for your reply.
I hear there's a complicated way to get the JPEGImageWriter that comes with Java to essentially 'set' the file size of the output. So you can take any Java image you want and turn it into a JPG file of exactly the size you specify. Of course, the lower the file size, the more information you lose.
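I don't know the exact mechanism, but one common way to approximate a target size is to encode with an explicit compression quality and step it down until the output fits the budget. A sketch, with class and method names of my own choosing:

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.MemoryCacheImageOutputStream;

public class JpegBudget {
    // Encode at a given quality (0..1); lower quality -> smaller output.
    static byte[] encode(BufferedImage image, float quality) throws IOException {
        ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        writer.setOutput(new MemoryCacheImageOutputStream(baos));
        writer.write(null, new IIOImage(image, null, null), param);
        writer.dispose();
        return baos.toByteArray();
    }

    // Step the quality down until the output fits the byte budget.
    // A real implementation might binary-search the quality instead.
    static byte[] encodeUnder(BufferedImage image, int maxBytes) throws IOException {
        for (float q = 0.95f; q >= 0.05f; q -= 0.05f) {
            byte[] data = encode(image, q);
            if (data.length <= maxBytes) {
                return data;
            }
        }
        return encode(image, 0.05f); // best effort at the lowest quality
    }
}
```

Note that the JPEG writer expects an image without an alpha channel (e.g. TYPE_INT_RGB), and this approach only approximates the target size rather than hitting it exactly.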
However, if compressing/decompressing using the PNG or JPG format is out of the question (since the size isn't constant and the other side may not have a means to decode it), but you still need some sort of byte reduction, then consider drawing the image onto a TYPE_USHORT_565_RGB BufferedImage and sending the resulting short array.
The 565 image uses 5 bits for the red, 6 bits for the green, and 5 bits for the blue. That is, you're turning your image into a 16 bit color image. The change is visually negligible and the number of bytes to be transferred will still be constant. This is the most compression you're going to get without using ImageIO or writing your own stuff on the other end.
short[] toSend = ((DataBufferUShort) bufferedImage.getRaster().getDataBuffer()).getData();
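Putting the two steps together - draw onto a 565 image, then pull out its backing short array (class and method names are mine):

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferUShort;

public class Rgb565 {
    // Convert any image to 16-bit 5-6-5 color and return the raw shorts.
    static short[] toRgb565(BufferedImage source) {
        BufferedImage dest = new BufferedImage(source.getWidth(), source.getHeight(),
                BufferedImage.TYPE_USHORT_565_RGB);
        Graphics2D g = dest.createGraphics();
        g.drawImage(source, 0, 0, null); // drawing performs the color conversion
        g.dispose();
        // The data buffer of a TYPE_USHORT_565_RGB image is one short per pixel.
        return ((DataBufferUShort) dest.getRaster().getDataBuffer()).getData();
    }
}
```

The array length is exactly width * height, so the transfer size stays constant for a fixed image size.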