#include <ctype.h>
#include <stdlib.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    char text[1000];
    int c;                                  /* int, not char, so EOF can be detected */
    FILE *fp = fopen("filename", "r");
    int i = 0;

    if (fp == NULL)
        return EXIT_FAILURE;

    while ((c = fgetc(fp)) != EOF)          /* fgetc takes the FILE*, not the character */
    {
        /* do something with c, e.g. store it in text */
        if (i < (int)sizeof text - 1)
            text[i++] = (char)c;
    }
    text[i] = '\0';

    fclose(fp);
    return 0;
}

As I recall, you should use fseek and ftell to get the file size, then malloc to allocate enough space to hold the entire file, then rewind and fread to read the entire file at once.
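
Roughly like this, as a minimal sketch of that approach (the "filename" placeholder and the lack of any real processing of the buffer are assumptions, and fseek/ftell limit you to sizes that fit in a long):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *fp = fopen("filename", "rb");     /* "filename" is a placeholder */
    if (fp == NULL)
        return EXIT_FAILURE;

    /* Seek to the end and use ftell to learn the size */
    fseek(fp, 0, SEEK_END);
    long size = ftell(fp);
    rewind(fp);

    /* Allocate one extra byte so the buffer can double as a C string */
    char *buf = malloc((size_t)size + 1);
    if (buf == NULL)
    {
        fclose(fp);
        return EXIT_FAILURE;
    }

    /* Read the whole file in a single call */
    size_t got = fread(buf, 1, (size_t)size, fp);
    buf[got] = '\0';
    fclose(fp);

    /* ... use buf here ... */

    free(buf);
    return 0;
}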


As I recall you should use fseek and ftell to get the file size[...]

That's fine for simple use cases, but it gets troublesome very quickly. For example, large files introduce a hard limit on both getting the size and using it, because fseek and friends work with long int while malloc takes a size_t. Then there's the process memory cost of holding the whole file internally. We can go even deeper by considering different encodings, which affect your count calculations and how you handle the characters in code.

In general, if your logic starts with "okay, let's read the whole file into memory at once", then the logic is likely flawed. In nearly every case where a file is involved, you can read it in chunks (tokens, lines, records, etc.), and that's the vastly preferred method.
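
For example, line-by-line reading with fgets needs only a small fixed buffer no matter how large the file is (a minimal sketch; the "filename" and the empty processing step are placeholders):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char line[256];                         /* small fixed-size chunk */
    FILE *fp = fopen("filename", "r");      /* "filename" is a placeholder */

    if (fp == NULL)
        return EXIT_FAILURE;

    /* Process the file one line at a time instead of loading it whole */
    while (fgets(line, sizeof line, fp) != NULL)
    {
        /* do something with line: count characters, parse a record, etc. */
    }

    fclose(fp);
    return 0;
}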
