I need to be able to read and write huge files that are 20+ GB, and I need to do it fast. Any suggestions or tips? Thanks.

1) Use the 64-bit version of VC++; I think its fstream implementation supports huge files.

2) Call the Win32 API read/write functions, which support huge files (see ReadFile() and WriteFile()).
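Both suggestions come down to the same thing: make sure your file offsets are 64-bit. Here is a minimal sketch (not from the original posts) that statically checks the standard library's stream offset is wide enough for files past 4 GB, and copies a file of any size in large buffered chunks so memory use stays constant:

```cpp
#include <cstdint>
#include <fstream>
#include <vector>

// std::streamoff must be at least 64 bits to address files larger than
// 4 GB; this holds on 64-bit VC++ and on modern Linux toolchains.
static_assert(sizeof(std::streamoff) >= 8, "need 64-bit stream offsets");

// Copy a file of any size in large buffered chunks.
// Returns the number of bytes copied. The 1 MiB chunk size is an
// arbitrary starting point; tune it for your disk.
std::uint64_t copy_file(const char* src, const char* dst,
                        std::size_t chunk = 1 << 20)
{
    std::ifstream in(src, std::ios::binary);
    std::ofstream out(dst, std::ios::binary);
    std::vector<char> buf(chunk);
    std::uint64_t total = 0;
    while (in) {
        in.read(buf.data(), static_cast<std::streamsize>(buf.size()));
        std::streamsize got = in.gcount();
        if (got <= 0) break;
        out.write(buf.data(), got);
        total += static_cast<std::uint64_t>(got);
    }
    return total;
}
```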

How you do it depends on whether you just need serial input/output, or whether you need some random access to the data. If you need random access, then, as Ancient Dragon said, you need support for files larger than 4 GB (the maximum a 32-bit offset can address). Current Linux systems support access to very large files, on both 32-bit and 64-bit systems, by defining the type fpos_t appropriately.

I am trying to read each file in full, encode it, and write it back out. I need to do this in a fast and efficient way. Thanks.
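Since that is a purely serial read → encode → write pass, the usual pattern is to stream the file in fixed-size chunks so memory use stays flat no matter how big the file is. A sketch along those lines follows; the XOR "encoder" is only a placeholder standing in for whatever real encoding is being applied:

```cpp
#include <cstdint>
#include <fstream>
#include <vector>

// Placeholder "encoder": XOR each byte with a key. Stand-in for the
// real encoding step, whatever that is.
void encode_chunk(std::vector<char>& buf, std::streamsize n, char key = 0x5A)
{
    for (std::streamsize i = 0; i < n; ++i)
        buf[static_cast<std::size_t>(i)] ^= key;
}

// Stream src -> encode -> dst in fixed-size chunks; memory use stays
// constant regardless of file size. Returns bytes processed.
std::uint64_t encode_file(const char* src, const char* dst,
                          std::size_t chunk = 4 << 20)  // 4 MiB
{
    std::ifstream in(src, std::ios::binary);
    std::ofstream out(dst, std::ios::binary);
    std::vector<char> buf(chunk);
    std::uint64_t total = 0;
    for (;;) {
        in.read(buf.data(), static_cast<std::streamsize>(buf.size()));
        std::streamsize got = in.gcount();
        if (got <= 0) break;
        encode_chunk(buf, got);
        out.write(buf.data(), got);
        total += static_cast<std::uint64_t>(got);
    }
    return total;
}
```

Because XOR is its own inverse, running the function twice round-trips the data, which makes the pattern easy to verify on a small file before pointing it at 20 GB.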

Here is an I/O profiling link, and here is a link to some info on memory-mapped file I/O that may be of interest to you.