I have a database which is quite large (around 1.7 million rows, with about 40 fields). I want to run about 10 different queries and echo the results out in tables, spaced with headings for the separate result sets, but I am worried about how long this will take to display.

How can I make these 10 queries return results faster than simply repeating the example code below 10 times, one block after another, with slightly different field names?

An example of one of the queries is below; all the queries run against the same database and table, just with slightly different fields.


$result = mysql_query("SELECT * FROM `table` WHERE `Field.1` = '$test' AND `Field.2` > $test1 ORDER BY `Field.3` + 0 LIMIT 3")
    or die(mysql_error());

echo "<table class=\"sample\">";
echo "<tr><th>h1</th><th>h2</th><th>h3</th><th>h4</th><th>h5</th><th>h6</th><th>h7</th></tr>";
// Keep getting the next row until there are no more to get
while ($row = mysql_fetch_array($result)) {
    // Print the contents of each row as one table row
    // (the original echoed h3 twice, giving 8 cells under 7 headers)
    foreach (array('h1', 'h2', 'h3', 'h4', 'h5', 'h6', 'h7') as $col) {
        echo "<td>" . $row[$col] . "</td>";
    }
    echo "</tr>";
}
echo "</table>";


I have never worked with 1.7 million rows before, but I know that to keep a database fast you will need to optimize it. You could set up a cron job / scheduled task to optimize that database about twice a day.

I don't think having 10 queries will slow it down too much. MySQL is fast (though 1.7 million rows will slow it down). You could always use one query and put the data into an array, BUT you would absolutely destroy PHP's memory with that amount of data (1.7 million rows!).
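The "one query into an array" idea can be sketched like this: pull the rows for all the report sections in one round trip (with a suitable `WHERE` and `LIMIT`, never the whole 1.7 million rows), then group them in PHP by a discriminator column. The column name `section` is a placeholder, not the poster's real schema, and the legacy `mysql_*` calls are shown only to match the original code.

```php
<?php
// Group fetched rows into one array keyed by a discriminator column.
// `section` stands in for whatever column distinguishes the ten result sets.
function group_rows_by_section(array $rows) {
    $grouped = array();
    foreach ($rows as $row) {
        $grouped[$row['section']][] = $row;
    }
    return $grouped;
}

// Hypothetical usage with the legacy mysql_* API from the original post:
// $result = mysql_query("SELECT * FROM `table` WHERE `section` IN ('a','b','c')")
//     or die(mysql_error());
// $rows = array();
// while ($row = mysql_fetch_assoc($result)) { $rows[] = $row; }
// $bySection = group_rows_by_section($rows);
```

Each entry of the returned array can then be echoed as its own table, so the database is hit once instead of ten times.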

Does your database have to be that big? Can you not split it into different tables?

Kieran :)



Yeah, I could split it up, but I have to download the updates once per week, so I presume I would need a script to split the data up in some kind of order and then rewrite the web side to read from the different tables, which sounds like a big job to me. How would I convert the results into one array before I try splitting the database?

Thanks for your reply
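On the "convert into one array" question: once the rows are grouped, each group can be rendered as its own headed table with a small helper. This is only a sketch; the function name, the `h2` heading tag, and the column names are illustrative, not part of the original code.

```php
<?php
// Render one heading plus one <table> for a single result section.
// $rows is an array of associative rows; $columns lists which row keys
// to print, in order (matching the <th> headers).
function render_section_table($heading, array $rows, array $columns) {
    $html = "<h2>" . htmlspecialchars($heading) . "</h2>";
    $html .= "<table class=\"sample\"><tr>";
    foreach ($columns as $col) {
        $html .= "<th>" . htmlspecialchars($col) . "</th>";
    }
    $html .= "</tr>";
    foreach ($rows as $row) {
        $html .= "<tr>";
        foreach ($columns as $col) {
            $html .= "<td>" . htmlspecialchars($row[$col]) . "</td>";
        }
        $html .= "</tr>";
    }
    $html .= "</table>";
    return $html;
}

// Hypothetical usage: loop over the grouped array and echo each section.
// foreach ($bySection as $heading => $rows) {
//     echo render_section_table($heading, $rows,
//         array('h1', 'h2', 'h3', 'h4', 'h5', 'h6', 'h7'));
// }
```

Escaping each value with `htmlspecialchars` also avoids broken markup if the data contains `<` or `&`.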
