Hi,
I'm following a book, but I don't understand std::string::size_type and some other data types, especially typedef. I see programmers use them a lot.

typedef int grades[5];
grades studentA[5], studentB[5];

and

Why not just use int instead of size_type when the result is the same?

#include <iostream>
#include <string>
#include <algorithm>

int main(){

    const std::string hexdigits = "0123456789ABCDEF";
    std::cout << "Enter a series of numbers between 0 and 15"
        << " separated by spaces"<< std::endl;

    std::string result=""; 
    //std::string::size_type n;
    int n;
    while (std::cin >> n)
        if (n < hexdigits.size())
            result += hexdigits[n];
    std::cout << "Your hex number is: " << result << std::endl;

    std::cin.get();
    std::cin.ignore();

    return 0;
}

Edited 2 Years Ago by Sarkurd

Well, the size_type of any container (or of std::string) is just the preferred type for indices and sizes related to that container. It is usually fine to use another integer type, such as int. There are a few dangers associated with not using size_type, but in most circumstances they are very unlikely to bite.

For example, if you do this for-loop:

for(int i = 0; i < str.size(); ++i)
  ...

The type returned by size() is size_type, and if that type is larger than int, then the value returned by str.size() could be larger than what can be represented by int. This leads to an infinite loop: when i reaches the maximum of its range, incrementing it wraps around to a large negative value (formally, signed overflow is undefined behaviour, but wrap-around is what typically happens), so i can never reach a value larger than size().

Using size_type guarantees that you won't get those kinds of problems, but obviously, you can see that such problems won't occur when the sizes or indices are small. So, just consider using size_type as the "playing it safe" option, which is what you would do when writing industrial-strength code.

Re. the other part of your question: the use of typedef can either be helpful to your code flow, or it can hinder your understanding of what the code is doing by hiding details.

For an example of common use:
One can easily make iterators to arrays using typedef

For an example of questionable/controversial use:
some hiding of pointers with typedef

There ... now you can Google "typedef hiding pointer problem", and see here:

http://discuss.fogcreek.com/joelonsoftware1/default.asp?cmd=show&ixPost=10506

Edited 2 Years Ago by David W

I'm not a big fan of using typedefs to hide a pointer type. It just adds confusion and usually doesn't make the syntax simpler; often it's the opposite (int* becomes IntPtr or something like that). Even when using smart pointers (as one should!) like std::unique_ptr or std::shared_ptr, you still need to make it clear in the name of the typedef what kind of smart pointer it is: std::unique_ptr<int> becomes IntUniquePtr, which again is not a huge gain in terms of syntax, and it often still leads to confusion or non-idiomatic code.

Other practical reasons people use typedefs are:

Reduce large template types and nested types to something shorter within the body of a function, like so:

 typedef typename std::vector<T>::iterator Iter;
 for(Iter it = v.begin(), it_end = v.end(); it != it_end; ++it)
    ...

Naming the actual type only in a single place, instead of everywhere where it is used. Like this:

template <typename T>
class vector {
  public:
    typedef std::size_t size_type;

    size_type size() const;
    size_type capacity() const;

    void resize(size_type new_sz);
    ...
    // If std::size_t was used everywhere, then changing it would mean
    // you would have to change it everywhere. But now, you only have 
    // to change the typedef.
    // Note that the same is true for within a large function.
};

Hide away template types to make them appear as simple types, like this example from the standard (the std::string class):

namespace std {

  typedef basic_string<char, char_traits<char>, allocator<char>> string;

}

And, of course, when you do template meta-programming, typedefs are used all over the place because they are basically the meta-programming equivalent of variables in ordinary programming.
