In the function get_temp_urls() in spider.php, an SQL query is executed and its results are stored in an array named $tmp_urls.
Limit the SQL results to, say, 0,100 and your spider will work fine, because the array stays small. And if you think the spider will then stop after 100 URLs, you are wrong: it keeps going as long as there is something left to spider (depending on your settings)!
The SQL I use reads:
$result = mysql_query("select link from ".$mysql_table_prefix."temp where id='$sessid' limit 0,10");
This also reduces the overhead before the spider starts crawling and cuts down on queries against the database!
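For context, here is a minimal sketch of how the patched function might look. The function body, parameter, and loop are assumptions for illustration (Sphider's real get_temp_urls() may differ); the only change described above is the added LIMIT clause. The old mysql_* API is kept to match the original code.

```php
<?php
// Hypothetical sketch of a patched get_temp_urls() in spider.php.
// Only the LIMIT clause is the change described in this tip; the
// rest of the body is an assumption about how the function works.
function get_temp_urls($sessid) {
    global $mysql_table_prefix;
    $tmp_urls = array();
    // Fetch at most 10 pending URLs for this session instead of all
    // of them, keeping $tmp_urls (and memory use) small. The spider
    // calls this again once the batch is processed, so crawling
    // still continues past 10 URLs.
    $result = mysql_query("select link from ".$mysql_table_prefix.
                          "temp where id='$sessid' limit 0,10");
    while ($row = mysql_fetch_row($result)) {
        $tmp_urls[] = $row[0];
    }
    return $tmp_urls;
}
?>
```

Adjust the limit (10 vs. 100) to taste; a smaller batch means less memory per pass but more round trips to the database.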
Try it and be surprised!