I have a strange problem in VS 2008. I have ASCII codes of symbols and I want to get a String^ variable as the result of converting them. This is my code:

int arr[10]; // array with ASCII codes of symbols
unsigned char a;
String^ str;
for ( int i = 0; i<10; i++) 
     a = arr[i];

But if arr[i] = 178, for example, I don't get '▓' in the variable a, but some other symbol.

Your first problem is mismatched data types.

You're forcing a signed value into an unsigned one as well as reducing the bit count, thereby corrupting your data!

int arr[]

but assigning its elements to an unsigned char:

a = arr[ i ];

should be
#define MAX_ARR 10

unsigned char arr[ MAX_ARR ];

arr must be int. I tried to define it as unsigned int, but that is not the right way to solve the problem. I also tried to change the "Character Set" option in the project properties from "Use Unicode" to "Not Set" and to "Use Multi-Byte", but that wasn't the right way either.

Where did you hear that using unsigned int is not the right way to solve a problem? Quite often professors are stuck in the dark ages of BASIC and first-release C. I've taught classes where I've had to re-educate students in the modern, revised methods.

Buggy whips are great for horse-drawn carriages, but we use electric cars now! Well, not quite, but I've made my point!

In true structured programming the correct data type should be used. There are no negative characters, so no sign bit is ever needed; thus they are unsigned. In the olden days 7-bit ASCII was all there was, especially in the days of the teletype, but then it was expanded to 8-bit, then multi-byte, and then wide-char, which includes Unicode.

So your integer array MUST be unsigned int!

But I'm going to show you both ways to convert your data: the C-style cast and the C++ cast.

int arr[ MAX_ARR ];
unsigned char a;

Using multiple casts:
a = (unsigned char)(unsigned int)arr[ i ];

The sign bit is unused, so while the integer value is stored in a larger-than-8-bit form, the cast first makes it unsigned, then reduces the bit count.

unsigned int arr[ MAX_ARR ];
unsigned char a;
a = (unsigned char) arr[ i ];

a = static_cast<unsigned char>( arr[ i ] );

I don't say that unsigned int is a mistake. But using unsigned int can't solve this problem. I think the problem is in the ANSI/Unicode handling or something like that.