I have declared a 2D vector that looks like this.
Now I need a 3D vector, but I don't know how to declare one.
As you can see, I have put 500 as the size of the first dimension.
I then push_back into the 2nd dimension, and I will also push_back into the 3rd dimension.
Thanks in advance!

std::vector<std::vector<std::string> > TwoDimensions(500, std::vector<std::string>());

As usual in C and C++, typedef lends a helping hand:

typedef std::vector<std::string> String1D;
typedef std::vector<String1D> String2D;
typedef std::vector<String2D> String3D;
// A slightly terrible construct, for those who judge a vector as a true array surrogate:
String3D s3d(2, String2D(2, String1D(2)));
// End of nightmare. Now we have a 2x2x2 matrix (of strings)...
s3d[1][1][1] = "111";
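For anyone who wants to run that sketch as-is, here is a minimal compilable version (the 2x2x2 dimensions are the reply's own):

#include <iostream>
#include <string>
#include <vector>

typedef std::vector<std::string> String1D;
typedef std::vector<String1D> String2D;
typedef std::vector<String2D> String3D;

int main()
{
    // A 2x2x2 cube of (empty) strings, fully sized up front.
    String3D s3d(2, String2D(2, String1D(2)));

    s3d[1][1][1] = "111";
    std::cout << s3d[1][1][1] << '\n'; // prints: 111
    return 0;
}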

Wonderful! There is one thing I have to do, though: use push_back for the 2nd and 3rd dimensions, but not for the 1st.
I have tried to write it like this, but when I compile, the compiler says:
"syntax error: ')'"

I have not declared anything like this before : )

typedef std::vector<std::string> String1D;
typedef std::vector<String1D> String2D;
typedef std::vector<String2D> String3D;

String3D abc(1000,  String2D((), String1D()  ));
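For reference, a minimal sketch of that call without the stray parentheses, assuming the goal is 1000 empty 2D slots that are grown later with push_back:

// 1000 empty String2D elements; "String2D((), String1D())" is not a
// valid constructor call, which is what the syntax error points at.
String3D abc(1000, String2D());

// Grow the 2nd and 3rd dimensions with push_back, as intended:
abc[0].push_back(String1D());
abc[0].back().push_back("hello");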

Some syntax sugar (I don't like inheritance from STL classes, but it works;)):

typedef std::vector<std::string> String1D;
typedef std::vector<String1D> String2D;
typedef std::vector<String2D> String3D;

struct StringMatrix: public String2D
{
  // m rows, each holding n strings
  StringMatrix(int m, int n) : String2D(m, String1D(n)) {}
};

struct StringCube: public std::vector<StringMatrix>
{
  // m matrices, each n rows by k columns
  StringCube(int m, int n, int k) : std::vector<StringMatrix>(m, StringMatrix(n, k)) {}
};
// Now:
StringCube cube(3, 3, 3);
cube[1][1][1] = "111";

And so on...

Just so you know, the typedefs aren't necessary to do what you want... they just make it much easier. You had the right idea on your own. If this vector<string> is a 1D vector, and this vector<vector<string>> is a 2D vector, then, following the pattern, you should've been able to come up with this: vector<vector<vector<string>>> as a 3D vector.
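Spelled out, the pattern from that reply looks like this (the identifiers are illustrative, and the extra spaces between the >s matter on pre-C++11 compilers, as noted further down):

vector<string> oneD;                      // 1D: a row of strings
vector<vector<string> > twoD;             // 2D: rows of rows
vector<vector<vector<string> > > threeD;  // 3D: one more level of nesting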

Yes, that is true. I tried to follow the pattern. I declare a 2D and a 3D vector here:

//2D
vector<vector<string> > abc(500, vector<string>()); 

//3D
vector<vector<vector<string>>> abcde(500, vector<string>(), vector<string>());

The 2D version compiles, and I have tried to follow the pattern for the 3D, but I get a lot of compiler errors with it.
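For reference, a sketch of the 3D declaration that follows the 2D pattern: the vector constructor takes one count and one fill value, not one value per dimension, and here the fill value must itself be 2D:

//3D
vector<vector<vector<string> > > abcde(500, vector<vector<string> >());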

You should write it like this: vector<vector<vector<string> > > abcde; (with spaces between the >s). Otherwise the compiler will think that you mean the >> operator.
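Putting the thread together, a minimal complete program using the spaced form from this reply and the push_back pattern from the original question (the names and dimensions are the thread's own):

#include <iostream>
#include <string>
#include <vector>

using std::string;
using std::vector;

int main()
{
    // 1st dimension fixed at 500; 2nd and 3rd grown with push_back.
    vector<vector<vector<string> > > abcde(500, vector<vector<string> >());

    abcde[0].push_back(vector<string>()); // grow the 2nd dimension
    abcde[0][0].push_back("hello");       // grow the 3rd dimension

    std::cout << abcde[0][0][0] << '\n';  // prints: hello
    return 0;
}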
