
Hello all, I have a small question here.
Let's say I want to tokenize a string (or multiple strings) from input.
For example, my input file has the following:
1
2
3

And this is my tokenizing code:

#include <cstdio>
#include <cstring>
#include <iostream>
#include <vector>
using namespace std;

char line[256];

vector<char*> tokenize(FILE* in) {
    vector<char*> tlist;
    char* word;
    while (1) {
        fgets(line, 256, in);
        if (feof(in)) break;
        // split the line on spaces, commas, periods, and newlines
        word = strtok(line, " ,.\n");
        while (word != NULL) {
            tlist.push_back(word);
            word = strtok(NULL, " ,.\n");
        }
    }
    // print the collected tokens to verify them
    for (size_t i = 0; i < tlist.size(); i++) {
        cout << tlist[i] << "\n";
    }
    return tlist;
}

What it should be doing is simply storing the tokens 1, 2, and 3 in the vector, which I then verify by printing them out.
However, it seems that every time the program reads the next line and pushes its token onto the vector, that token overwrites all the previously stored ones.

So with that code, my program simply prints "3" for every element of the vector.
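
My guess is that strtok returns pointers into the line buffer itself, so reusing that buffer for the next line changes what every stored pointer sees. Here's a minimal sketch of that suspicion (a made-up example, not my actual program):

#include <cstring>
#include <iostream>

int main() {
    char buf[256];

    strcpy(buf, "1");
    char* first = strtok(buf, " ,.\n");  // points into buf, not a copy

    strcpy(buf, "3");                    // stands in for the next fgets
    std::cout << first << "\n";          // prints "3", not "1"
    return 0;
}

If that's what's going on, I suppose I'd need to copy each token (say, into a std::string) rather than store the raw pointer, but I'd appreciate confirmation.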

Any advice?
