I'm trying to convert hexadecimal values to binary.
I've converted decimal values to binary, up to 32 bits, this way:

#include <stdio.h>

void showbits(unsigned int); /* function prototype */

int main(void)
{
    unsigned int num;

    printf("enter the number: ");
    scanf("%u", &num); /* %u matches unsigned int */

    printf("%u in binary is\n", num);

    showbits(num); /* function call */

    return 0;
}

void showbits(unsigned int n) /* function definition */
{
    int i;
    unsigned int andmask, k;

    for (i = 31; i >= 0; i--)
    {
        andmask = 1u << i;
        k = n & andmask;
        k == 0 ? printf("0") : printf("1");
    }
    printf("\n");
}

Now I want to add hexadecimal input to it.

What makes you think there's a difference? Decimal vs. hexadecimal is just a display representation; the underlying bits of the corresponding values are unchanged.
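A minimal sketch to illustrate the point: the literals 15 and 0xF denote the same value, so any bit operation on them produces identical results. (This is a standalone demo, not part of the program above.)

#include <stdio.h>

int main(void)
{
    unsigned int dec = 15;  /* decimal literal */
    unsigned int hex = 0xF; /* hexadecimal literal, same value */

    /* the comparison holds: only the source-code spelling differs */
    printf("equal: %d\n", dec == hex);

    /* masking either one touches the same underlying bits */
    printf("%u %u\n", dec & 0xFu, hex & 0xFu);

    return 0;
}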

What do you think the problem is?
I know that

Decimal vs. hexadecimal is a display representation; the underlying bits of the corresponding values are unchanged.

But in the above program, the user can only enter the decimal digits 0-9. I need code that can print
00000000000000000000000000001111 for the hex digit F.

Note that scanf converts the display representation to a value, so %d looks for decimal input. %x will let you read hexadecimal values.
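A minimal sketch of that change, with the bit loop from showbits above inlined for brevity: switching the scanf conversion from %d to %x is the only edit the original program actually needs.

#include <stdio.h>

int main(void)
{
    unsigned int num;
    int i;

    printf("enter a hex number: ");
    if (scanf("%x", &num) != 1) /* %x parses hexadecimal input such as F */
        return 1;

    /* same masking loop as showbits: print bit 31 down to bit 0 */
    for (i = 31; i >= 0; i--)
        printf("%u", (num >> i) & 1u);
    printf("\n");

    return 0;
}

Entering F at the prompt yields 00000000000000000000000000001111, exactly the output asked for above.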

Well, this is embarrassing. I'd never thought of it that way.
I was looking for a data type that could hold alphanumeric digits.
But hey, that's what discussion forums are for, right?
Thanks for the help, Narue.
