I have a 2 GB .txt file which contains thousands of queries. I need to read those queries from the file, split them into n .txt files, and save them in a folder. Can anyone post an example or a guide?


Splitting large File into Parts in PHP

I want it in PHP.

To achieve this you need to make sure that each query is written on a single line, so that each line can become one array element.
Example:
$arrayQuery[0] = 'Here goes the first query';
$arrayQuery[1] = 'Here goes the second query';

Following is the code:

<?php
// Read the whole file into an array, one line (query) per element
$arrayQuery = file('filename.txt');

$newFilePrefix = 'Query';
$fileNumber = 0;

foreach ($arrayQuery as $oneQuery) {

    // Write each query to its own file: Query0.txt, Query1.txt, ...
    $newFileName = $newFilePrefix . $fileNumber++ . '.txt';
    $handler = fopen($newFileName, 'w');
    if (fwrite($handler, $oneQuery)) {
        echo 'Query written to file: ' . $newFileName . '<br>';
    } else {
        echo 'Error while writing query to: ' . $newFileName . '<br>';
    }
    fclose($handler);
}
?>

Please change filename.txt to your file name.

How can I read every 50 lines out of 10,000 from a text file using PHP?

Use a counter like $fileNumber in the example above. After every N lines, close the current file and create a new one.
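
For illustration, a minimal sketch of that counter idea might look like this (it reuses file() as in the example above; the Part file names are just placeholders):

<?php
// Read all lines, then open a new output file every 50 lines
$arrayQuery = file('sample.txt');

$linesPerFile = 50;
$fileNumber = 0;
$handler = null;

foreach ($arrayQuery as $index => $oneQuery) {
    // Every 50 lines, close the current file and start the next one
    if ($index % $linesPerFile === 0) {
        if ($handler) {
            fclose($handler);
        }
        $fileNumber++;
        $handler = fopen('Part' . $fileNumber . '.txt', 'w');
    }
    fwrite($handler, $oneQuery);
}

if ($handler) {
    fclose($handler);
}
?>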

The following code instead uses array_chunk() to split a large file into 50-line parts.

<?php
// Read the whole file into an array, then group the lines into
// chunks of 50 with array_chunk()
$arrayQuery = file('sample.txt');

$multiChunk = array_chunk($arrayQuery, 50); // 50 lines per file

$newFilePrefix = 'Part';
$fileNumber = 0;

foreach ($multiChunk as $chunk) {

    $fileNumber++;
    $newFileName = $newFilePrefix . $fileNumber . '.txt';
    $handler = fopen($newFileName, 'w');

    // Track whether every line in this chunk was written successfully
    $status = array();
    foreach ($chunk as $line) {
        if (fwrite($handler, $line)) {
            $status[] = 1;
        } else {
            $status[] = 0;
        }
    }

    if (in_array(0, $status)) {
        echo 'Something went wrong while writing ' . $newFileName . '<br>';
    } else {
        echo 'Data successfully written to ' . $newFileName . '<br>';
    }

    fclose($handler);
}
?>

@peeyush.budhia: Spoon-feeding is not recommended. He'll learn nothing.

It sounds like you have a file with 10,000+ lines in it that you need to split into 50-line chunks and save to new files...

If the file is well over your PHP memory limit, reading the whole thing into memory will only cause you headaches.

I'd highly suggest using SplFileObject to iterate over the file line by line. This will give you a much smaller memory footprint.

Create a blank file, also using SplFileObject, and iterate over 50 lines of the original file, writing each line to the new blank file.
Keep a counter of how many lines you've read; when you reach 50, simply move on to the next blank file.

I'll be happy to share code examples.
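
For instance, a rough sketch of that approach (the sample.txt and Part file names, and the 50-line chunk size, are only placeholders):

<?php
// Iterate the source file line by line with SplFileObject so the
// whole 2 GB file never has to fit in memory at once.
$linesPerFile = 50;

$source = new SplFileObject('sample.txt', 'r');
$source->setFlags(
    SplFileObject::READ_AHEAD |
    SplFileObject::DROP_NEW_LINE |
    SplFileObject::SKIP_EMPTY
);

$lineCount = 0;
$fileNumber = 0;
$target = null;

foreach ($source as $line) {
    // Start a new part file every 50 lines
    if ($lineCount % $linesPerFile === 0) {
        $fileNumber++;
        $target = new SplFileObject('Part' . $fileNumber . '.txt', 'w');
    }
    $target->fwrite($line . PHP_EOL);
    $lineCount++;
}
?>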

Thanks all
