Hi,
I need help with concatenating several bit values into one big value.
For example:
let's say I have these char variables that I want to concatenate into one value.

unsigned char x = 0x02; //00000010
unsigned char y = 0x01; //00000001
unsigned char z = 0x03; //00000011

and the output should be, e.g., 000000100000000100000011

Can someone please advise on how I can do this?

Edited 5 Years Ago by Nick Evan: Fixed the starting code-tag.

This is by no means a direct solution to your problem, but the code snippet here does demonstrate a method of decimal-to-binary conversion and string concatenation.

It basically takes three ASCII decimal values (120 = 'x', 121 = 'y', 122 = 'z'), converts them to their binary equivalents, and concatenates them into a single string.

#include <iostream>
#include <string>
using namespace std;


/**
Convert a decimal value to its binary string representation.
*/
string dec2bin(int n)
{
    char buf[40];      // room for 32 bits, a sign, and the terminator
    int index = 39;
    buf[index] = '\0';
    bool neg = false;

    if(n < 0)
    {
        neg = true;
        n = -n;
    }
    do
    {
        int d = n % 2;                  // take the lowest bit...
        buf[--index] = (char)(d + '0');
        n >>= 1;                        // ...then shift it away
    } while(n > 0);

    if(neg) {
        buf[--index] = '-';
    }
    // Return a copy of the digits; buf is local, so a raw pointer
    // into it would dangle once the function returns.
    return string(&buf[index]);
}


int main()
{
    string bin_val_1 = dec2bin(-120); // ASCII value 120 = 'x'
    string bin_val_2 = dec2bin(-121); // ASCII value 121 = 'y'
    string bin_val_3 = dec2bin(-122); // ASCII value 122 = 'z'
    // Concatenate the strings
    string bin_val = bin_val_1 + bin_val_2 + bin_val_3;
    cout << bin_val << endl;
    return 0;
}
Program's output:
-1111000-1111001-1111010

A point to note: the binary values returned do not include leading zeros, when in actual fact the output should be

-01111000-01111001-01111010

Also note that in C++ a hex literal such as 0x02 is just an ordinary integer value, so it can be passed to dec2bin directly with no separate conversion step, and that the code has not been checked for errors.

Hope this helps get you started.

Use std::bitset<>, perhaps?

#include <bitset>
#include <limits>
#include <string>
#include <iostream>

int main()
{
    unsigned char x = 0x02 ; 
    unsigned char y = 0x01 ;
    unsigned char z = 0x03 ;

    typedef std::bitset< std::numeric_limits< unsigned char >::digits > bits ;
    std::cout << bits(x).to_string() + bits(y).to_string() + bits(z).to_string() << '\n' ;
}