Hi,

Recently I got a text file which contains a bunch of numbers:

1,0,0,1,0................
1,0,1,1,1................

I would like to read these numbers line by line and store them in an array.

However, I get a seg fault with this:

char filename[5000];
char dump;

strcpy(filename,argv[2]);


while(!feof(tfPtr))
{
	for (i=0; i<16; i++)
	{
		fscanf(filename, "%d", t[i]);
		fscanf(filename, "%c", dump);
	}
  sum=t[3]+t[5];
  sum2=t[2]+t[4];
}
 fclose(tfPtr);

  exit(0);

When I compile and run it with a filename:
./test.exe test.txt
Segmentation fault (core dumped)

Please help, or does anyone have any other suggestions? Thanks.

You could try reading the whole file into memory and storing it in a character array, i.e. a char*. Then you could use the function strtok() to "tokenise" the character array, i.e. split it up using the delimiters you specify, which in your case would be commas. I'd suggest you read up on strtok() and try your hand with it. If you need it, I can give you a little example of how to do this.

Hope this helps.

>fscanf(filename, "%d", t[i]);
>fscanf(filename, "%c", dump);
Don't forget that scanf expects a pointer. If your variable isn't already a pointer, you need to prefix it with the address-of operator. Also, fscanf's first argument has to be the FILE * you got from fopen (tfPtr here), not the filename string:

fscanf(tfPtr, "%d", &t[i]);
fscanf(tfPtr, "%c", &dump);

Hi,

If I hard-coded the filename, it would mean the filename is fixed at all times. What I want, at the end of the day, is something that runs over a whole bunch of different files, namely test.txt, test1.txt, etc. That is why I declare a string argument to get the filename.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

char *strtok(char *str1, const char *str2);

#define SIZE 16

main (int argc, char *argv[])
{

FILE *cfPtr; /* for reading data to expand.txt for vector based */
FILE *tfPtr; /* for reading data from svmkb.txt */ 

int nt[SIZE];
int subs[7];
int i;
char *filename;
char *result=NULL;
char input[5000];
char *delims=",";
strcpy(filename,argv[2]);

tfPtr = fopen(filename,"rt");
       if (tfPtr == NULL)
       {
          printf("Can't find input file %s",filename);
          exit(-1);
       }

fgets(input,SIZE,tfPtr); 
    if (input == NULL)
    {
       printf("Error on initial read from universe file.\n");
       fclose(tfPtr);
       exit(-1);
    }


while(!feof(tfPtr))
{
     result =strtok(input, delims);
        
	for (i=0; i<16; i++)
	{
 		fscanf(tfPtr, "%d", &result[i]);
		         
        	nt[i]=result[i];
		
	}

>strcpy(filename,argv[2]);
You haven't allocated any memory to filename. An uninitialized pointer is not the same thing as a pointer to infinite memory. You're better off just using an array instead of a pointer, especially in the case of a file name where you can figure out the maximum length of a valid path.

>tfPtr = fopen(filename,"rt");
I'm curious. Why not just use argv[2] instead of juggling with the filename pointer?

>#define SIZE 16
>char input[5000];
>fgets(input,SIZE,tfPtr);
That seems like kind of a waste to me. You give input 5000 characters but only utilize 15 of them for data, since fgets reads at most SIZE-1 characters plus the terminating null.

>nt[i]=result[i];
>fscanf(tfPtr, "%d", &result[i]);
If strtok() doesn't find a token, result is NULL at this point, and indexing it will likely get you a segmentation fault. And result points at chars, not ints, so &result[i] is the wrong destination for %d; read straight into &nt[i] instead.
