Hello,

I am programming an image application with a GUI.
I open a grayscale image like this:

String^ s = openFileDialog1->FileName;
// keep only the file name (everything after the last backslash)
s = s->Substring(s->LastIndexOf('\\') + 1, (s->Length - s->LastIndexOf('\\')) - 1);
InputImageFileName = s;
bmpInputImage = gcnew System::Drawing::Bitmap(openFileDialog1->FileName);
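
(I think System::IO::Path::GetFileName would do the same extraction in one call, by the way:)

// should give the same result as the Substring line above
InputImageFileName = System::IO::Path::GetFileName(openFileDialog1->FileName);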

Then I create a new image:

binaryImage = gcnew System::Drawing::Bitmap(bmpInputImage); // working copy of the input image

and I want to work with this image as an unsigned char*, so that I can use each pixel value to build a histogram and do further operations.

I have found the following method, but even if it is what I need (I am not sure), I do not know how to use it:

System::Drawing::Image::Save(System::String^ filename, System::Drawing::Imaging::ImageFormat^ format)
//then I will apply format->ToString();
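
From the signature I guess the call would be something like this (untested; the file name and the PNG format are just examples):

// untested guess: save the loaded bitmap as a PNG file
bmpInputImage->Save("output.png", System::Drawing::Imaging::ImageFormat::Png);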

Could any of you help me?
Thanks in advance!!

You can't just cast an image into a char array. Besides, Microsoft has given us a lot of functions to play with:

Bitmap^ binaryImage=gcnew System::Drawing::Bitmap(bmpInputImage);
Color pixelColor = binaryImage->GetPixel(1, 1); // get the pixel at x=1, y=1
Byte g = pixelColor.G;
Byte b = pixelColor.B;
Byte r = pixelColor.R;
Byte a = pixelColor.A;

Now the vars r, g, b and a will hold the color values of pixel (1,1).
(NOTE: this is untested)
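
If you really need the pixels through an unsigned char*, look at Bitmap::LockBits instead of GetPixel. Here is an untested sketch (it assumes a 24bpp image; pick the PixelFormat that actually matches yours):

// untested sketch: lock the whole bitmap and walk the raw bytes
System::Drawing::Rectangle rect(0, 0, binaryImage->Width, binaryImage->Height);
System::Drawing::Imaging::BitmapData^ data = binaryImage->LockBits(
    rect,
    System::Drawing::Imaging::ImageLockMode::ReadOnly,
    System::Drawing::Imaging::PixelFormat::Format24bppRgb);

unsigned char* p = (unsigned char*)data->Scan0.ToPointer();
int stride = data->Stride; // bytes per row, including any padding

// pixel (x, y): the blue byte is p[y * stride + x * 3]
// (byte order is B, G, R in Format24bppRgb)

binaryImage->UnlockBits(data); // always unlock when you are done

LockBits is also much faster than GetPixel when you have to touch every pixel.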

So this way I have got the colour value, but how can I change the bytes to unsigned char?
The idea is to get hold of the pixel values of a bitmap image so that I can build a histogram, accessing the bitmap like this:

int histoArray[256] = {0}; // the bins must start at zero
for (unsigned int a = 0; a < 256; a++)
{
    for (unsigned int b = 0; b < 256; b++)
    {
        // 'bitmap[a][b]' is how I would like to index the pixels
        histoArray[bitmap[a][b]] += 1;
    }
}
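
(With GetPixel that would translate to something like this, I guess — untested, and assuming the image is grayscale so the R, G and B channels are equal:)

int histoArray[256] = {0}; // bins start at zero
for (int y = 0; y < binaryImage->Height; y++)
{
    for (int x = 0; x < binaryImage->Width; x++)
    {
        // for a grayscale bitmap any channel holds the intensity
        unsigned char gray = (unsigned char)binaryImage->GetPixel(x, y).G;
        histoArray[gray] += 1;
    }
}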

Tell me if this works:

unsigned char valr = (unsigned char)pixelColor.G;
std::cout << valr << '\n';

If not, please post the error message. I haven't worked with these kinds of things for a while..

It does not work. Look at the value of valG. I think the cast is not working properly.

[Attachment: valG.JPG]

The colour is what I was expecting, but I don't understand the meaning of the symbol next to the value. Depending on the pixel there is a symbol or a number. What does it mean?

Because it's a char, your debugger shows the character that the ASCII value represents. Since decimal 2 is the 'start of text' control character, the debugger shows a little square :) If the value were 33, it would have shown you '!'.

Don't worry about it; you can still do calculations with the value.
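
If you want to see the number instead of the character, cast it to int before printing:

std::cout << (int)valr << '\n'; // prints e.g. 2 instead of the control character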

Here's the ASCII table

OK!! ASCII... Good to know!!
Thank you so much for your help; I am going to mark this thread as solved.
Thank you!!
