
I have a 2GB .txt file containing thousands of queries. I need to read the queries from that file, split them into n .txt files, and save those in a folder. Can anyone post an example or guide me?

Edited by pritaeas: Moved to PHP.



Splitting large File into Parts in PHP


To achieve this you need to make sure that each query is written on a single line, so that each line can become an array element.
Example:
$arrayQuery[0] = 'Here goes the first query';
$arrayQuery[1] = 'Here goes the second query';

Following is the code:

<?php
// Read the whole file into an array, one query (line) per element
$arrayQuery = file('filename.txt');

$newFilePrefix = 'Query';
$fileNumber = 0;

foreach ($arrayQuery as $oneQuery) {

    // Each query goes into its own file: Query0.txt, Query1.txt, ...
    $newNumber = $fileNumber++;
    $newFileName = $newFilePrefix . $newNumber . '.txt';
    $handler = fopen($newFileName, 'w');
    if ($handler === false) {
        echo 'Could not open ' . $newFileName . '<br>';
        continue;
    }
    if (fwrite($handler, $oneQuery)) {
        echo 'Query written to file: ' . $newFileName . '<br>';
    } else {
        echo 'Error while writing query to: ' . $newFileName . '<br>';
    }
    fclose($handler);
}
?>

Please change filename.txt to your file name.


How can I read every 50 lines out of the 10,000 lines in a text file using PHP?


Use a counter like $newNumber in the example above. After every N lines save the file and create a new one.


Following is the code to split a large file into chunks of 50 lines each.

<?php
// Read the whole file into an array (one line per element)
$arrayQuery = file('sample.txt');

// Break the array into chunks of 50 lines each
$multiChunk = array_chunk($arrayQuery, 50);

$newFilePrefix = 'Part';
$fileNumber = 0;

foreach ($multiChunk as $chunk) {

    $fileNumber++;
    $newFileName = $newFilePrefix . $fileNumber . '.txt';
    $handler = fopen($newFileName, 'w');
    if ($handler === false) {
        echo 'Could not open ' . $newFileName . '<br>';
        continue;
    }

    // Write the chunk line by line, remembering whether any write failed
    $allWritten = true;
    foreach ($chunk as $line) {
        if (fwrite($handler, $line) === false) {
            $allWritten = false;
        }
    }

    if ($allWritten) {
        echo 'Data successfully written to ' . $newFileName . '<br>';
    } else {
        echo 'Something went wrong while writing ' . $newFileName . '<br>';
    }

    fclose($handler);
}
?>

It sounds like you have a file with 10,000+ lines in it that you need to split into 50-line chunks and save to new files...

If the file is well over your PHP memory limit, reading the whole thing into memory will only cause you headaches.

I'd highly suggest using SplFileObject to iterate over the file line by line. This will give you a much smaller memory footprint.

Create a blank file, also using SplFileObject, and iterate over 50 lines of the original file, writing each line to the new blank file.
Keep a counter of how many lines you've read through. When you've reached 50, simply start a new blank file.

I'll be happy to share code examples.
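Here's a minimal sketch of that approach. The file names (sample.txt, Part1.txt, ...) and the 120-line demo input are just placeholders for illustration; with a real 2GB file you'd skip the demo setup and point SplFileObject at your own file.

```php
<?php
// Demo setup: create a small stand-in for the real large file (120 lines).
file_put_contents('sample.txt', implode(PHP_EOL, range(1, 120)) . PHP_EOL);

$linesPerFile = 50;

// Iterate the source line by line instead of loading it all into memory.
$source = new SplFileObject('sample.txt');
$source->setFlags(
    SplFileObject::READ_AHEAD | SplFileObject::SKIP_EMPTY | SplFileObject::DROP_NEW_LINE
);

$fileNumber = 0;
$lineCount  = 0;
$target     = null;

foreach ($source as $line) {
    // Every $linesPerFile lines, switch to a fresh output file: Part1.txt, Part2.txt, ...
    if ($lineCount % $linesPerFile === 0) {
        $fileNumber++;
        $target = new SplFileObject('Part' . $fileNumber . '.txt', 'w');
    }
    $target->fwrite($line . PHP_EOL);
    $lineCount++;
}

echo 'Wrote ' . $fileNumber . ' part files';
```

Because only one line is held in memory at a time, this works the same whether the source file is 10,000 lines or millions.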
