Hi,

I've written a program that can sort an array of ints, e.g. array[6] = {4,6,-1,0,6,4}, but I want to expand it so it can order chars as well. As my code stands, it will order chars correctly, i.e. sort the numbers first and then the chars, but this relies on no int being greater than the decimal ASCII value of any of the chars.

I am looking for ideas on how to pass my sorting function an array such as {A,67,9,0,B,D} and have it sorted to {0,9,67,A,B,D}. I already have the sorting algorithm in place for ints; I just need some ideas on how to expand it for chars.

Many thanks.

A really bad solution (I think) would be to create two vectors: push all the ints (i.e. anything between two ASCII values) into one, all the chars into the other, sort them both, then use the two vectors to repopulate the array.
I think it would work, but I imagine there is a far simpler and more elegant way to do it.

Sorry for my ignorance, but when you pass an array like the one you showed, "{A,67,9,0,B,D}", are the characters in that array actually ASCII codes?

If so then would the array look like: {65,67,9,0,66,68}?

What I get from the OP is that he wants to send integers and characters to his function in one array so that it sorts them, but wouldn't it be hard to figure out which ones are supposed to be the integers and vice versa if the characters are stored as ASCII codes?

Or is there a way to mix integers and characters in one array in C++?

Again, sorry for my ignorance. I'm in the early stages of C++/Programming logic.

Sorry for my ignorance, but when you pass an array like the one you showed, "{A,67,9,0,B,D}", are the characters in that array actually ASCII codes?

If so then would the array look like: {65,67,9,0,66,68}?

I think it would be best to sort them separately, because technically speaking a char is not smaller or bigger than an int, and you can't sort its ASCII code along with the other numbers without messing up the order. The array would look like: {65,67,9,0,66,68}.

Why not just convert everything to ASCII first? I'm not sure how to handle '67', but the unsorted array would look like [65, (67 in ASCII), 57, 48, 66, 68]. Then sort it with your int sorter and convert back. The 2+ digit numbers might be a problem, but a function to convert all the elements to ASCII first seems the simplest approach.

1. Make sure everything is kept as a character; use some denotation as a terminator so you can have multi-digit numbers without much difficulty.


2. Convert these chars to their equivalent ASCII integers. If the char is a digit ('0'-'9'), give it precedence over any non-digit character; a simple boolean variable will suffice. Make sure your loop checks for those terminators so multi-digit numbers stay together.

3. Use a standard sorting algorithm; I'm sure you've already set one up for your initial program.

So from the replies here I have concluded that I will need to find some way to register when a char is present as opposed to an int. Looking at the ASCII table, how would I tell the difference between:

an int of value 68 and a char of value 'D' (whose decimal ASCII equivalent is 68)?

http://www.asciitable.com/

Many thanks,

What you need is basically a discriminated union (often called a variant type). This is just a union of an integer and a character along with a flag to tell which type is actually stored in the union. Here is a simple example:

class CharOrInt {
  private:
    union {
      char c;
      int i;
    } data;

    enum { char_data, int_data } data_type;

  public:
    explicit CharOrInt(char c) { data.c = c; data_type = char_data; }
    explicit CharOrInt(int i) { data.i = i; data_type = int_data; }

    CharOrInt& operator=(char c) { data.c = c; data_type = char_data; return *this; }
    CharOrInt& operator=(int i) { data.i = i; data_type = int_data; return *this; }

    bool operator <(const CharOrInt& rhs) const {
      if(data_type == char_data) {
        if(rhs.data_type == char_data)
          return data.c < rhs.data.c;  // char vs char: compare directly
        else
          // char vs int: digit chars compare by numeric value; chars below
          // '0' sort before every int, chars above '9' sort after
          return ((data.c < '0') || ((data.c <= '9') && (data.c - '0' < rhs.data.i)));
      } else {
        if(rhs.data_type == char_data)
          return ((rhs.data.c > '9') || ((rhs.data.c >= '0') && (rhs.data.c - '0' > data.i)));
        else
          return data.i < rhs.data.i;  // int vs int: compare directly
      }
    }
};

You can basically use the above type in place of int in your sorting algorithm and it should work as you want. If not, just modify it to suit your needs.
