Hi,
I have a large file (11 GB) that I want to extract information from. I decided that the file was too big to work with, so I ended up splitting it into 20 smaller files. Now I don't know what the smartest thing to do with these files is. Are they still too big? I want to open and read them, but I suspect it's too time-consuming to do that so many times. I'm pretty new to Python, so I'm not sure how to proceed.
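(For context, a minimal sketch of the kind of split I mean; the file names and the chunk size are placeholders, not my real data:)

# Sketch: split a big text file into smaller files by line count.
# "big_file.txt", the chunk names, and LINES_PER_CHUNK are placeholders.
LINES_PER_CHUNK = 5000000

chunk = None
with open("big_file.txt") as src:
    for i, line in enumerate(src):
        if i % LINES_PER_CHUNK == 0:  # time to start the next chunk file
            if chunk:
                chunk.close()
            chunk = open("chunk_%02d.txt" % (i // LINES_PER_CHUNK), "w")
        chunk.write(line)
if chunk:
    chunk.close()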
Recommended Answers
It depends what you want to do with the file. Generally the way to work with a large file is split and merge, or you may not need to split at all if you work with generator expressions that don't load all the data into memory at once. So could you specify the processing you are doing to …
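As a sketch of the generator idea in that reply (the file name tab.txt comes from the answer below; the filter condition and the numeric column are assumptions):

def matching_values(path):
    # Iterating over the file object is lazy: only one line is in memory
    # at a time, so the whole 11 GB never has to be loaded or pre-split.
    with open(path) as f:
        for line in f:
            fields = line.rstrip("\n").split("\t")
            if len(fields) > 2 and fields[0] == "interesting":  # assumed filter
                yield float(fields[2])                          # assumed column

total = sum(matching_values("tab.txt"))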
"I have a large file (11 GB) that I want to extract information from."
At the risk of posting something off-topic, is using Python an absolute requirement? If not, and assuming you are on *nix, you can easily extract the probability using a one-liner:
cat tab.txt | …
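A rough Python stand-in for the same pipeline idea, chaining the 20 split files into one lazy stream (the one-liner above is truncated, so the tab-separated layout and the column being averaged are assumptions):

import fileinput
import glob

total = 0.0
count = 0
# fileinput chains all the chunk files into a single line stream,
# read lazily like the shell pipe above.
for line in fileinput.input(sorted(glob.glob("chunk_*.txt"))):
    fields = line.rstrip("\n").split("\t")
    if len(fields) > 1:
        total += float(fields[1])   # assumed column of interest
        count += 1
print(total / count if count else 0.0)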