Hi,
I have a large file (11 GB) that I want to extract information from. I decided that the file was too big to work with, so I ended up splitting it into 20 smaller files. Now I don't know what the smartest thing to do with these files is. Are they still too big? I want to open and read them, but I was thinking it's too time-consuming to do that so many times. I'm pretty new to Python and I'm not sure how to proceed.
sofia85
0
Junior Poster in Training
Recommended Answers
It depends on what you want from the file. Generally the way to work with a large file is split and merge, or you may not need to split at all if you use generator expressions that avoid loading all the data into memory at once. So could you specify the processing you are doing to …
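A minimal sketch of that generator idea, assuming the lines of interest contain a `prob=` marker (a guess based on the code in the next answer): instead of splitting the 11 GB file, iterate over it lazily so only one line is in memory at a time.

```python
def interesting_lines(path):
    """Yield only the lines we care about, one at a time."""
    with open(path) as f:
        for line in f:            # file objects iterate lazily, line by line
            if 'prob=' in line:   # placeholder filter; adapt to your data
                yield line
```

Because `interesting_lines` is a generator, you can pass it to a loop, `sum`, or the `itertools` functions without the whole file ever being read into memory.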
If you are unsure about how much memory you need, maybe you can separate the wanted part of the file into another file:
    def get_prob(line):
        before, prob, after = line.partition('prob=')
        if prob:
            return after.partition(';')[0]

    with open('result.txt', 'w') as out_file, open('data_sofia.txt') as in_file:
        for line in in_file:
            prob …
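The preview cuts off after `prob`, so here is one way the loop might be completed; a hedged sketch, assuming the goal is to write each extracted value to `result.txt`. The file names are wrapped in a function (`extract_probs` is my name, not from the thread) so the paths are easy to swap.

```python
def get_prob(line):
    # split on the first 'prob='; `prob` is '' when the marker is absent
    before, prob, after = line.partition('prob=')
    if prob:
        # keep everything between 'prob=' and the next ';'
        return after.partition(';')[0]

def extract_probs(in_path, out_path):
    # stream the big file line by line; only one line is in memory at a time
    with open(in_path) as in_file, open(out_path, 'w') as out_file:
        for line in in_file:
            prob = get_prob(line)
            if prob is not None:
                out_file.write(prob + '\n')
```

Run once as `extract_probs('data_sofia.txt', 'result.txt')`, and the much smaller `result.txt` should then be comfortable to work with.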