Hi, I have a problem storing a value in a 16-bit variable. Here is how I am doing it.

```cpp
typedef unsigned int UINT;
typedef UINT* ID_PTR;

unsigned short int myclass[5];
const ID_PTR arr16_bit[5] = {
    (UINT*) &myclass[0],
    (UINT*) &myclass[1],
    (UINT*) &myclass[2],
    (UINT*) &myclass[3],
};
```

Now I am accessing myclass like this:

``*(*(arr16_bit + 2)) =  65000;``

I can access the array, but 65000 gets cut down to 8 bits. Why?

/thanks
kursist


How did you decide this is the case?

And how big is your "int"?

Sorry dude, but it accesses fine (for the case you gave).

```cpp
#include <iostream>
#include <cstdlib>
using namespace std;

// uses the arr16_bit declaration from the question above
int main() {
    *(*(arr16_bit + 2)) = 65000;

    cout << dec << **(arr16_bit + 2) << " " << *arr16_bit[2] << endl;

    system("pause");
    return 0;
}
```

You'll probably find that using a pointer to a 32-bit int to write to a 16-bit int overwrites the next 16-bit int. In other words, make sure you access 16-bit numbers through a pointer to a 16-bit type.

So use `typedef unsigned short int UINT;` (Note the addition of 'short').

Thanks, this solved the problem.

```cpp
// note: the pointer type must stay 16-bit, and the index must be
// in range (the array has elements 0..4)
unsigned short* ptr = *(arr16_bit + 2);

*ptr = 10000;
```

:)
