#include <ctype.h>
#include <stdlib.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    char text[1000];
    FILE *fp = fopen("filename", "r");
    int i = 0;
    int c;                               /* must be int, not char, so EOF can be detected */

    if (fp == NULL)
    {
        perror("filename");
        return 1;
    }

    while ((c = fgetc(fp)) != EOF)       /* read from the FILE pointer, not from c */
    {
        /* do something with c, e.g. store it in text */
        if (i < (int)sizeof text - 1)
            text[i++] = (char)c;
    }
    text[i] = '\0';

    fclose(fp);
    return 0;
}

All 2 Replies

As I recall, you should use fseek and ftell to get the file size, then malloc to allocate enough space to hold the entire file, then rewind and fread to read the entire file at once.
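Something along these lines (a minimal sketch of that approach; the filename is a placeholder and most error checks are omitted for brevity):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *fp = fopen("filename", "rb");  /* "rb" so ftell reflects the byte count */
    long size;
    char *buf;

    if (fp == NULL)
    {
        perror("filename");
        return 1;
    }

    fseek(fp, 0L, SEEK_END);             /* jump to the end of the file */
    size = ftell(fp);                    /* current offset == file size in bytes */
    rewind(fp);                          /* back to the beginning */

    buf = malloc((size_t)size + 1);      /* +1 for a terminating '\0' */
    if (buf == NULL)
    {
        fclose(fp);
        return 1;
    }

    fread(buf, 1, (size_t)size, fp);     /* slurp the whole file in one call */
    buf[size] = '\0';

    /* ... use buf ... */

    free(buf);
    fclose(fp);
    return 0;
}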

As I recall you should use fseek and ftell to get the file size[...]

That's fine for simple use cases, but it gets troublesome very quickly. For example, large files introduce a hard limit on both getting the size and using it, because fseek and friends work with long int while malloc takes a size_t. Then you have concerns about process memory usage from holding the whole file internally. We can go even deeper by considering different encodings, which affect your size calculations and how you handle the characters in code.

In general, if your logic starts with "okay, let's read the whole file into memory at once", then the logic is likely flawed. In nearly every case where a file is involved, you can read it in chunks (tokens, lines, records, etc.), and that's a vastly preferred method.
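For example, a line-by-line loop with fgets keeps memory usage bounded no matter how large the file is (a minimal sketch; the filename and the processing step are placeholders):

#include <stdio.h>

int main(void)
{
    char line[1024];                     /* fixed-size chunk: one line at a time */
    FILE *fp = fopen("filename", "r");

    if (fp == NULL)
    {
        perror("filename");
        return 1;
    }

    while (fgets(line, sizeof line, fp) != NULL)
    {
        /* process the current line here, e.g. count characters or parse fields */
        fputs(line, stdout);
    }

    fclose(fp);
    return 0;
}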
