I have been given an assignment: find the frequencies of all words in a large text file. I have written a program that does this for a sample string, by reading that string into an array. But for a text file spanning many pages with thousands of words, won't that array eat up a lot of space? I have been asked to treat performance as the prime criterion.
Any suggestion would be awesome.
Shantanu88d
Recommended Answers
I think this would work reasonably fast:

```php
<?php
$filename = "/path/to/file.txt";
$handle = fopen($filename, "r");
if ($handle === false) {
    exit;
}
$results = [];
$word = "";
while (false !== ($letter = fgetc($handle))) {
    if ($letter == ' ') {
        if ($word !== "") {
            $results[$word] = ($results[$word] ?? 0) + 1;
        }
        $word = "";
    } else {
        $word .= $letter;
    }
}
// Count the final word, which is not followed by a space.
if ($word !== "") {
    $results[$word] = ($results[$word] ?? 0) + 1;
}
fclose($handle);
```
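Reading one character at a time with `fgetc()` means a function call per byte, which can dominate the runtime on a large file. A line-based variant is a minimal sketch of the same idea, assuming words are separated by whitespace (the file path is a placeholder):

```php
<?php
// Read line by line with fgets() and split each line on whitespace.
// This keeps only one line plus the counts array in memory at a time.
$filename = "/path/to/file.txt"; // placeholder path
$handle = fopen($filename, "r");
if ($handle === false) {
    exit("Cannot open $filename\n");
}
$results = [];
while (($line = fgets($handle)) !== false) {
    // -1 = no limit on pieces; PREG_SPLIT_NO_EMPTY drops empty strings.
    foreach (preg_split('/\s+/', $line, -1, PREG_SPLIT_NO_EMPTY) as $word) {
        $results[$word] = ($results[$word] ?? 0) + 1;
    }
}
fclose($handle);
arsort($results); // most frequent words first
print_r(array_slice($results, 0, 10, true));
```

Splitting on `/\s+/` also handles tabs and newlines, which the space-only comparison above would miss.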
You can always put your results in a MySQL table, if you're worried that memory will become a problem.
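One way to sketch the MySQL idea: keep the counts in a table and let the database do the accumulation with an upsert. The table name `word_counts`, its schema, and the connection credentials below are all assumptions for illustration:

```php
<?php
// Assumed table: CREATE TABLE word_counts (word VARCHAR(255) PRIMARY KEY, freq INT NOT NULL);
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass'); // placeholder credentials
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Insert a word with count 1, or bump its count if it already exists.
$stmt = $pdo->prepare(
    'INSERT INTO word_counts (word, freq) VALUES (?, 1)
     ON DUPLICATE KEY UPDATE freq = freq + 1'
);

$handle = fopen('/path/to/file.txt', 'r'); // placeholder path
while (($line = fgets($handle)) !== false) {
    foreach (preg_split('/\s+/', $line, -1, PREG_SPLIT_NO_EMPTY) as $word) {
        $stmt->execute([$word]); // one upsert per word; wrap in a transaction for speed
    }
}
fclose($handle);
```

This trades memory for round-trips to the database; wrapping the loop in a single transaction (or batching the inserts) would reduce that overhead considerably.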
You came here for a suggestion and got a complete solution. Lucky guy.