mschroeder 251 Bestower of Knowledge Team Colleague

The query issue is because you are not passing the submitted account id into your query. You do on line 57, but not on line 69. Can you explain why you have two separate queries here?

mschroeder 251 Bestower of Knowledge Team Colleague

I did read the entire thread and took the time to read through the OP's code and provide a detailed response. You may have provided some php assistance :icon_rolleyes: but you, as well as another poster, still suggested that the OP drop php validation in favor of javascript. The problem with this is that eventually someone else will come across this thread, read that post and think "this is a good idea, I should remove my server-side validation in favor of javascript!" and then their application will be terribly vulnerable. The posts I quoted are TERRIBLE advice, and I stand by that.

mschroeder 251 Bestower of Knowledge Team Colleague

if( isset($_POST) && !empty($_POST) ){
  exit();
}

Seriously this code is of no use. Just delete it and use javascript.

Just use Javascript validation and submit its value to modify_data.php, make a connection in the modify_data.php file and retrieve the value from the previous page using the post method. Build a select query on this page using the retrieved values and display the data.

This is TERRIBLE advice. Validation should never be done just on the client-side. All it takes is disabling javascript to be able to submit any data you want to the form. Or simply making a post/get request with something like curl.

Make it validate server-side (PHP) first, then worry about javascript validation, as this benefits the user and prevents the need to submit the form for trivial issues.

First, have you tried turning error_reporting on? Error reporting is often disabled in production environments. Just add error_reporting(E_ALL | E_STRICT); after the <?php tag in your files. This will ensure you are actually seeing EVERY php error, even the notices.
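
For example, something like this at the very top of the script (the display_errors line just makes sure the errors are actually printed while you debug):

<?php
//Report every error, including notices
error_reporting(E_ALL | E_STRICT);
//Make sure the errors are actually displayed on the page while debugging
ini_set('display_errors', 1);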

Note: line numbers are relevant to your posted code.

Second, lines 28 - 31 of modify_data.php: if that query doesn't return any results, it will show your message but then continue running the code after it. http://php.net/manual/en/function.exit.php - exit() is just one way to terminate execution.
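
A rough, untested sketch of what I mean, assuming the mysql extension and that your query result is in $result:

if( mysql_num_rows($result) == 0 ){
  echo 'Sorry, no matching record was found.';
  exit(); //Stop here so the rest of the page never runs
}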

Third, the query on line 33 appears as though it will fetch all of the data in your db since there is no where condition to limit it.

Fourth, …

mschroeder 251 Bestower of Knowledge Team Colleague

Well for starters, your file is technically delimited by the space character. Since your rows are not identified with a unique id, I will assume that the user name is a unique field and thus could be used as your key.

When you build your table, link the usernames to an additional page in the sense of profile.php?user=S_Una

On the profile page, you will need to read through the entire file looking for a line whose first column contains a matching username. This is where flat file databases become slow. The bigger the file the more comparisons that need to occur. If your file is going to remain small then leaving it as you have it is probably sufficient. However, writing data to it via PHP will be more difficult.

//Open the file as an SplFileObject
//This does not read the entire file into memory so it is very efficient
//However you must iterate line by line
$file = new SplFileObject('yourfile.ext');

//Set the file object to be read as a CSV file
//It doesn't matter that the file is not comma delimited; we will set the delimiter to a space character
$file->setFlags(SplFileObject::READ_CSV);

//Change the delimiter
$file->setCsvControl(' ');

//Get the username passed from the url string
$user = $_GET['user'];

//Start iterating over the file line by line
foreach( $file as $line ){
  //OPTIONAL - Lowercase both values so the comparison is case-insensitive
  if ( strtolower($user) == strtolower($line[0]) ){
    //If we find the matching line, set it to the $row variable
    //Break …
mschroeder 251 Bestower of Knowledge Team Colleague
$doc = new DOMDocument();
$doc->loadXML('Your XML string');

//Gets the SpecialParameter node value
$specialParameter = $doc->getElementsByTagName('SpecialParameter')->item(0)->nodeValue;

//Gets the SpecialInfo::someNumber attribute value
$specialInfo = $doc->getElementsByTagName('SpecialInfo')->item(0)->getAttribute('someNumber');

This is untested but I believe this should get you in the right direction at the minimum.

iamthwee commented: you beat me to it +15
mschroeder 251 Bestower of Knowledge Team Colleague

1. For starters, anyone who has been around any form of serious web hosting environment will quickly tell you that there is NO such thing as unlimited hosting. There are no unlimited hard drives and there are no unlimited bandwidth connections. At some point there is a limit. Unlimited hosts may be getting better with the proliferation of massive hard drives and fast connections but there will always be a limit somewhere.

These shops tend to target smaller setups with very small resource footprints, and that is fine, but as soon as the resources grow past some often unwritten soft limit you will have problems. This also tends to lead to a lot of overselling. Again, if the resources are there it is not a big deal, until hundreds of accounts on the same server start trying to use the same chunk of unlimited bandwidth.

If you value the quality of your service and intend to have your site/service grow to a sizable level then do yourself a favor and steer clear of the allure of unlimited hosting. If you're just using it to get your feet wet and get off the ground it could be a cheap way to get started. You get what you pay for.

2. Video processing is going to require memory and processor resources. These are probably the most expensive and limited resources in a server. If your site sees any kind of steady traffic, you will probably quickly outgrow a …

mschroeder 251 Bestower of Knowledge Team Colleague

There are many different services that provide this kind of functionality.

For example:
http://sendgrid.com/ (One of my favorites)
http://www.streamsend.com/
http://mailchimp.com/

They're all going to charge you for their services though. In return you'll get high deliverability as well as all kinds of analytics. SendGrid has a great set of APIs.

There are probably tens if not hundreds of other companies that offer the same kinds of features out there.

mschroeder 251 Bestower of Knowledge Team Colleague

I agree with jkon, the only caveat is needing a browscap.ini, which isn't packaged with php by default. The browscap directive can only be set in the php.ini or httpd.conf files.

mschroeder 251 Bestower of Knowledge Team Colleague

$_FILES["file"]["type"] is supplied by the browser and is not checked by php for accuracy. You can overcome this by using something like the Fileinfo extension to actually check the file's mime type (http://www.php.net/manual/en/book.fileinfo.php)

You could drastically simplify the way that code reads by supplying an array of file types and then checking if the provided type is in that array. e.g:

$allowed = array('image/gif','image/jpeg','image/pjpeg','application/pdf','application/msword','application/vnd.ms-excel');

if ( in_array($_FILES["file"]["type"], $allowed) && $_FILES["file"]["size"] < 20000000 )
{...}

Also, if you're going to match by the mime type you should consider normalizing the value that is provided. At a minimum make its case consistent, e.g. $type = strtolower($_FILES["file"]["type"]);

mschroeder 251 Bestower of Knowledge Team Colleague

While I don't really understand why you got down voted, I will make an assumption it has to do with the lack of effort on your part.

If I was going to approach this, I would probably be looking at the php DOM extension (http://www.php.net/manual/en/book.dom.php).

Using an xPath query (http://www.php.net/manual/en/class.domxpath.php) to return all of my rows <tr></tr>

Then iterate over the collection of rows, checking if each element's nodeName (http://www.php.net/manual/en/class.domnode.php) property is equal to 'th'. If it is, I would capture the nodeValue into an array, e.g. $attributes[$element->nodeValue].

Then to get its value I would use the DOMNode::nextSibling property to get the next adjacent node, which should be a td, so check its nodeName is equal to 'td' and then capture its nodeValue as the value of the previous node.

<?php
$doc = new DOMDocument();
$doc->loadHTMLFile("filename.html");
// OR 
//$doc->loadHTML("<html><body>Test<br></body></html>");

$xpath = new DOMXpath( $doc );
$elements = $xpath->query("//tr");

$attributes = array();

//Loop over every row object
foreach( $elements as $row ){

  //Loop over each cell in each row
  foreach( $row->childNodes as $cell ){
    
    //We only care about TH cells so we only look for those specifically
    if( strtoupper($cell->nodeName) == 'TH' ){
      
      //If we find a TH cell then check if we have an adjacent TD cell.
      if( $cell->nextSibling && strtoupper($cell->nextSibling->nodeName) == 'TD' ){
        
        //Create an array item using the TH's value as the key and the TD's value as the value.  e.g. $attributes['Beds'] = 7
        $attributes[$cell->nodeValue] = $cell->nextSibling->nodeValue;

      }

    }

  } …
mschroeder 251 Bestower of Knowledge Team Colleague

For free, any of the Eclipse variations will work really well, PDT being my personal favorite: http://www.eclipse.org/pdt/

However for the money I really prefer to work within Zend Studio.

mschroeder 251 Bestower of Knowledge Team Colleague

All of the documentation indicates that SplFixedArray methods have a >= 5.3.0 requirement. Works without issue on 5.3.5 and 5.3.6 here.

mschroeder 251 Bestower of Knowledge Team Colleague

Two things immediately jump to mind, are short php start tags enabled? Also are you using a version of php > 5.3.0?

mschroeder 251 Bestower of Knowledge Team Colleague

Can you give me a better example of what it is you have and what you're trying to find through a loop? Maybe some code that you're trying to get to work? setcookie('TestCookie', $value) would be read back with $_COOKIE['TestCookie']

There isn't a reason to loop over the cookie array unless you don't know what the keys are already. If you don't know the keys you could iterate over the array in a variety of ways:

foreach( $_COOKIE as $key => $value ){
  if( $key == 'TestCookie' ){
    echo $value;
  }

  //OR
  if( $value == 'Some Random Value' ){
    echo $key;
  }

  //etc.
}
mschroeder 251 Bestower of Knowledge Team Colleague

The array is already split. What you're asking doesn't make any sense.

$_COOKIE is an array
e.g. array( key => value, key => value, key => value)

Explode is meant to take a string like "this is a test" and explode it into
array( 0 => this, 1 => is, 2 => a, 3 => test )

If you're trying to turn the array into a string, you want implode.

$string = implode( ' ', $_COOKIE );
//value value1 value2 value3 value4
mschroeder 251 Bestower of Knowledge Team Colleague

$_COOKIE is already an array.

mschroeder 251 Bestower of Knowledge Team Colleague

It is just a little something I put together at the end of last year when I was seeing a lot of threads asking "what is faster, a, b or c?". It made it easier to set up controlled tests, run thousands of cycles, and see the measured results with little effort.

https://github.com/bkb-mschroeder/Benchmark

mschroeder 251 Bestower of Knowledge Team Colleague

Hashing is a one-way algorithm; it cannot be run in reverse.
Encryption is a two-way algorithm where a string can be encrypted and then decrypted.

md5 and sha1 can't be decrypted, but what those sites do is maintain giant databases of common lookups. So if you make your password 'password', the md5 will always be '5f4dcc3b5aa765d61d8327deb882cf99', which means they can store that and know that the hash always (barring collisions) matches 'password'.

With hashes it is recommended to always salt the hash with additional random characters that are unique to your site. So if your salt is "!@#$VSA!@#adjk_48ashkj345", no matter how weak someone's password is, by default it will be as strong as the salt.

e.g. A user's password of "password" is now "!@#$VSA!@#adjk_48ashkj345password!@#$VSA!@#adjk_48ashkj345" before it gets hashed. This prevents the hashes from being easily matched if your db is compromised, but does nothing if your site is exploited from the frontend, where an attacker throws common words at your login fields. This is where rate limits and failed login checks come into play.

This is also a place where it is suggested to make logging in as slow as possible by doing thousands, if not hundreds of thousands, of hash calculations, so you become a much less viable target for automated attacks.
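
A minimal, untested sketch of salting and stretching; the salt value and the iteration count are made up for the example:

<?php
$salt = '!@#$VSA!@#adjk_48ashkj345'; //Your site-wide salt
$password = $_POST['password'];

//Salt on both sides as described above
$hash = $salt.$password.$salt;

//Stretch: re-hash many times so every guess becomes expensive for an attacker
for( $i = 0; $i < 100000; $i++ ){
  $hash = hash('sha256', $hash);
}

//Store $hash, then repeat the exact same steps at login and compare the results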

mschroeder 251 Bestower of Knowledge Team Colleague

I believe the link was supposed to be http://ical.mac.com/ical/Portuguese32Holidays.ics

.ics is commonly associated with the iCalendar format and is actually a standard. iCalendar is not an xml format.
xCal is the xml representation of iCalendar format

The content of that file if you open it with a text editor is not in xml format but in the more common iCalendar format.

http://lmgtfy.com/?q=parse+iCalendar%2C+php

mschroeder 251 Bestower of Knowledge Team Colleague

My results are right on par with what you are seeing. Reflection is about twice as slow as using variable variables. I also benchmarked call_user_func and found that to be just about on par with the variable variable calls. I can't say I've seen reflection used as much as I've seen a mixture of variable class and method calls and call_user_func calls. Probably because of the performance hit.

PHP 5.3 added the ability to do variable static method calls $class::$staticMethod() , so I believe this is something that is here to stay in PHP.
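
For reference, here are the three call styles that were compared, shown on a made-up class:

<?php
class Example {
  public static function hello(){ return 'hello'; }
}

$class = 'Example';
$method = 'hello';

//Variable class/method call (variable static calls require PHP 5.3+)
echo $class::$method().PHP_EOL;

//Reflection
$reflection = new ReflectionMethod($class, $method);
echo $reflection->invoke(null).PHP_EOL; //null because the method is static

//call_user_func
echo call_user_func(array($class, $method)).PHP_EOL;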

Running Benchmark: Compare Variable Variables To Reflection To Call_User_Func
======================================================================
Running Test: test_Variable_Variable
	Cycles: 	 10000
	Mean: 		 0.0000545718 secs.
	Median: 	 0.0000410080 secs.
	Mode: 		 0.0000410080 secs.
	Range: 		 0.0287699699 secs.
	Min: 		 0.0000379086 secs.
	Max: 		 0.0288078785 secs.
======================================================================
Running Test: test_Reflection
	Cycles: 	 10000
	Mean: 		 0.0000928412 secs.
	Median: 	 0.0000720024 secs.
	Mode: 		 0.0000720024 secs.
	Range: 		 0.0295591354 secs.
	Min: 		 0.0000679493 secs.
	Max: 		 0.0296270847 secs.
======================================================================
Running Test: test_Call_User_Func
	Cycles: 	 10000
	Mean: 		 0.0000557658 secs.
	Median: 	 0.0000460148 secs.
	Mode: 		 0.0000460148 secs.
	Range: 		 0.0230920315 secs.
	Min: 		 0.0000429153 secs.
	Max: 		 0.0231349468 secs.
======================================================================
mschroeder 251 Bestower of Knowledge Team Colleague

This can be done by combining a few SPL iterators.

<?php

$iterator = new RegexIterator( 
	new RecursiveIteratorIterator( 
		new RecursiveDirectoryIterator('/path/to/your/directory/') 
	), 
	'/.*\.css$/'
);

foreach( $iterator  as $file ){
	echo $file.PHP_EOL;
}

Where $file in the foreach loop will be an instance of SplFileInfo.

You could also skip the regex iterator and create your own filter iterator that extends FilterIterator but you'd have to write your own accept() method.
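
An untested sketch of that approach; the class name is just something I made up:

<?php
//Only accepts files whose extension is .css
class CssFilterIterator extends FilterIterator
{
  public function accept()
  {
    //current() returns an SplFileInfo instance here
    $ext = pathinfo( $this->current()->getFilename(), PATHINFO_EXTENSION );
    return strtolower($ext) == 'css';
  }
}

$iterator = new CssFilterIterator(
  new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('/path/to/your/directory/')
  )
);

foreach( $iterator as $file ){
  echo $file.PHP_EOL;
}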

mschroeder 251 Bestower of Knowledge Team Colleague

I have personally used http://www.plupload.com/ with a 250MB/file upload limit with no issues. It will break the file up into pieces and then your script will need to reassemble the temporary uploads. Works really well from my experiences.

mschroeder 251 Bestower of Knowledge Team Colleague

No, classes can only extend one class, but they can implement multiple interfaces.

mschroeder 251 Bestower of Knowledge Team Colleague

Captchas are used to prevent the automated submission of your forms. To prevent things like comment spam, automated signups etc.

mschroeder 251 Bestower of Knowledge Team Colleague

There are major differences between the Mysql and Mysqli extensions.

Unless you're using MySQL < 4.1 or PHP 4, you should be using the mysqli extension, even if you're using the procedural version of it. There is NO reason to be using the mysql extension: it is not receiving active development (only maintenance releases), and it cannot utilize any of the new MySQL features added since 4.1 (5.0, 5.1, and 5.5 have all been stable releases since), such as character sets, prepared statements, multiple statements, transactions, enhanced debugging, embedded server support, etc.

Mysqli is also the suggested extension to be used for all new development.
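
Just to illustrate the style, an untested mysqli prepared-statement example with made-up connection details and table/column names:

<?php
$mysqli = new mysqli('localhost', 'user', 'password', 'database');

$id = 42;
$stmt = $mysqli->prepare('SELECT name, email FROM users WHERE id = ?');
$stmt->bind_param('i', $id); //Bind the value instead of interpolating it into the SQL
$stmt->execute();
$stmt->bind_result($name, $email);

while( $stmt->fetch() ){
  echo $name.' '.$email.PHP_EOL;
}

$stmt->close();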

mschroeder 251 Bestower of Knowledge Team Colleague

The actual countdown functionality is not php; it is javascript.
If you were going to implement this, on the page where you pull a list of auctions, calculate the remaining time by subtracting now from the end date/time.
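
Something along these lines on the php side, assuming the auction's end time is a MySQL DATETIME string in $row['EndTime']:

<?php
$end = strtotime( $row['EndTime'] );
$remaining = $end - time(); //Seconds left, negative once the auction has ended

//Hand the remaining seconds to your javascript countdown, e.g. via a data attribute
echo '<span class="countdown" data-seconds="'.max(0, $remaining).'"></span>';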

In your page you will need some JavaScript mechanism to take this time and use it to start a countdown that matches.

I doubt you'll find a drop-in solution that meets your requirements, but for reference here are some examples:

http://keith-wood.name/countdown.html
http://1plusdesign.com/articles/add-countdown-timer-website/

mschroeder 251 Bestower of Knowledge Team Colleague

If your hosting company is telling you their version of php does not support the mysqli extension, then you should quickly find a new host. This would indicate to me that either they are still running php 4 (php 5 was released in July 2004), or they are too stubborn/lazy to enable what I would consider a standard php extension at this point; it is included with PHP 5 by default.

Mysqli provides many benefits over the mysql extension, and they are not equals in any way. That is, of course, unless your host is also running a very old version of mysql as well (e.g. < 4.1), and this is considering that 5.0, 5.1, and 5.5 are all stable releases and 5.6 is well on its way to becoming the next stable release.

Mysql is not actively developed; it only receives minor maintenance releases. Mysqli is actively developed, recommended by MySQL as the preferred extension, and supports character sets, prepared statements, stored procedures, and multiple statements, just for starters. It also supports the additional features and options in MySQL 4.1+, which the mysql extension does not.

mschroeder 251 Bestower of Knowledge Team Colleague

Objects which cannot be serialized, or are composed of objects that can't be serialized, will cause headaches with your sessions, for starters.

In my experience there is a bigger performance hit for the server to serialize and unserialize session data than there is to recreate the objects from persistent storage, especially if the object collection grows and grows.
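
The usual alternative is to keep only an identifier in the session and rebuild the object on each request; a rough sketch with made-up names:

<?php
session_start();

//Store only a lightweight identifier in the session...
$_SESSION['user_id'] = $user->getId(); //hypothetical User object

//...and on the next request rebuild the object from persistent storage
$user = UserMapper::find( $_SESSION['user_id'] ); //hypothetical mapper/finder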

Stefano Mtangoo commented: Very useful! +6
mschroeder 251 Bestower of Knowledge Team Colleague

That's it.

mschroeder 251 Bestower of Knowledge Team Colleague

There probably isn't a reason to actually delete records.

On the query you use to pull the comments you should add an ORDER BY clause that sorts them by the date they were added to the database (assuming you have a column for this) and then add a LIMIT clause (http://dev.mysql.com/doc/refman/5.1/en/select.html#id848826) so that only the most recent 10 results are shown.
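
Something like this, with the table and column names obviously being guesses:

<?php
//Pull only the ten most recent comments; table/column names are placeholders
$sql = "SELECT author, body, date_added
        FROM comments
        ORDER BY date_added DESC
        LIMIT 10";
$result = mysql_query($sql);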

mschroeder 251 Bestower of Knowledge Team Colleague

It depends; session_start needs to be called before any output occurs or you'll get an error.

If your header include contains everything that would be between the header tags on the page, what is creating the container html? If your code looks like the following and you put the session start in the header include, it will error out, as the html is being sent to the browser before the session starts.

<html>
<head>
<?php include('header'); ?>
</head>
<body>
</body>
</html>
mschroeder 251 Bestower of Knowledge Team Colleague

You should have started a new thread for starters, but your problem is rather simple.
Your development environment is running at least php 5.3.0, which deprecated the session_is_registered function. (http://php.net/manual/en/function.session-is-registered.php)

Anywhere you have if( session_is_registered('key') ), replace it with if( isset( $_SESSION['key'] ) )

session_register is also deprecated btw.(http://www.php.net/manual/en/function.session-register.php)

mschroeder 251 Bestower of Knowledge Team Colleague
<?php

$count=1;

$disk = array(
    'test1',
    'test2',
    'test3',
);

foreach ($disk as $device)
{
    $var = 'strDisk'.$count;
    $$var = "<dataset seriesName='$device'>";
    $count++;
}

var_dump( get_defined_vars() );

If you look through the variable output you should see:

'strDisk1' => string '<dataset seriesName='test1'>' (length=28)
'strDisk2' => string '<dataset seriesName='test2'>' (length=28)
'strDisk3' => string '<dataset seriesName='test3'>' (length=28)

*Note* var_dump output is formatted by the xdebug plugin.

mschroeder 251 Bestower of Knowledge Team Colleague

If the quantity ALWAYS has a , after it you can just take the chunk that is before the comma.

$item = '1,';
$item = substr($item, 0, -1);
echo $item; //1

You could also use str_replace to replace all occurrences of commas.

$item = '1,';
$item = str_replace(',', '', $item);
echo $item; //1
mschroeder 251 Bestower of Knowledge Team Colleague

I'm not sure if I follow, but I'll take a shot at it.

Assuming $id from your query result is the ID-QUANTITY value, you would pass $id as the second parameter of explode. The explode function would "explode" it apart at the dash (the first parameter) and you'd be left with an array with two values.

$id = $result[0];
$quantity = $result[1];
mschroeder 251 Bestower of Knowledge Team Colleague

If your string is always going to be in the format of ID-QUANTITY explode() would be faster and removes the need to use a regular expression.

<?php
$value = '1234-57';
$result = explode('-', $value);

echo $result[0].PHP_EOL; //1234
echo $result[1].PHP_EOL; //57
mschroeder 251 Bestower of Knowledge Team Colleague

I assume you want functionality like wordpress, where as you type into your title, the url is generated from it and displayed for the user.

While this could be achieved with pure javascript, I think your best solution will be to use an ajax request, so it can check the database if the url is already taken.

I'm just going to reference some jquery stuff so you can try to put the pieces together before we go any further. Essentially, the user enters the headline, and on a .focusout() event (when they leave that input field for something else) you take what's in the headline field (see jQuery selectors) and make an ajax GET request to a php script which returns json (for this example).

The php script, filters the headline to Alphanumeric and spaces. Then replaces the spaces with dashes and then lowercases the entire string. (Filter the string first as this will prevent any multi-byte issues)

<?php

$string = $_GET['headline']; //From the ajax get request, e.g. "This is my test headline!"
$string = preg_replace( '/[^0-9a-zA-Z ]/', '', $string ); //Strip everything except alphanumerics and spaces
$string = str_replace( ' ', '-', $string ); //Replace spaces with dashes
$string = strtolower($string); //Produces "this-is-my-test-headline"

/* 
Do a database lookup and return a count() value
e.g.

$stmt = $dbh->prepare("SELECT COUNT(ThreadId) AS ThreadCount FROM threads WHERE Url = ?"); //table name assumed
if ( $stmt->execute( array($string) ) ) {
  //Should only return ONE row
  $row = $stmt->fetch();
  if( $row['ThreadCount'] != '0' ){
	//Tack on ThreadCount + 1 to the headline
	$string …
mschroeder 251 Bestower of Knowledge Team Colleague

@chrishea
I have a development server where projects get subdomains configured as virtual hosts; all other unmatched subdomains resolve to a single catch-all vhost. I don't think it is terribly uncommon, it just depends on the usage.

@tcollins
If you call <?php phpinfo(); ?> from that file and look through the results for HTTP_HOST and its value, you will probably see that HTTP_HOST is test.domain.com or fail.domain.com, not just the subdomain portion.
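
If you only want the subdomain portion you would need to pull it out of HTTP_HOST yourself, e.g.:

<?php
$host = $_SERVER['HTTP_HOST']; //e.g. test.domain.com

//Grab just the first label if all you want is the subdomain
$parts = explode('.', $host);
$subdomain = $parts[0]; //'test' or 'fail'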

mschroeder 251 Bestower of Knowledge Team Colleague

Having the value in a hidden input field would be no different than the user selecting it from a drop-down where it is also visible in the source.

If you inspect the source of most web forms you're going to commonly find the use of hidden input fields. This is why filtering/validating input and escaping your output is so important.

kekkaishi commented: exactly. +2
mschroeder 251 Bestower of Knowledge Team Colleague

If you want to keep the flow of your application as it is now: when the user clicks on the "Rate this Cigar" button, the cigar id is passed to the second page via a GET value, e.g. ?CigarId=####

Then on the second page, if that get value is present, swap out the dropdown for the text field with the information and add in a hidden field with the same name as your drop-down.

When the form is posted it will process as before, and the $_POST array will contain the value just as if it had been selected in the drop-down.
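
An untested sketch of that swap, assuming your drop-down is named CigarID and $cigarName was already looked up from the id:

<?php if( isset($_GET['CigarId']) ): ?>
  <!-- Show the cigar name as plain text and carry the id in a hidden field -->
  <p><?php echo htmlspecialchars($cigarName); ?></p>
  <input type="hidden" name="CigarID" value="<?php echo (int) $_GET['CigarId']; ?>" />
<?php else: ?>
  <!-- original drop-down goes here -->
<?php endif; ?>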

mschroeder 251 Bestower of Knowledge Team Colleague
//Check User's Last Review
$SQL = "SELECT COUNT(CigarID) AS ReviewCount FROM reviews_cigar WHERE UserID = $UserID AND CigarID = $CigarID AND DATE_SUB(CURDATE(),INTERVAL 30 DAY) <= DateAdded";
$Result = mysql_query($SQL);

//Error out if the query fails for some reason
if( !$Result ){
 die( mysql_error() );
}

//Query should ALWAYS return 1 row.
//Reviews in last 30 days : ReviewCount = 1+
//No Reviews in last 30 days : ReviewCount = 0
$row = mysql_fetch_array( $Result );
if( $row['ReviewCount'] > 0 ){
  die('Sorry but you have submitted a review for this cigar in the past 30 days');
}

You could also drop the count() portion of the query and return a field for every review the user has created in the last 30 days.
Then implement similar functionality using if( mysql_num_rows( $Result ) > 0 ){ die(); }

mschroeder 251 Bestower of Knowledge Team Colleague

Assuming DateAdded is a Date or DateTime column in mysql.

SELECT COUNT(CigarID) AS ReviewCount FROM reviews_cigar WHERE $UserID = UserID AND $CigarID = CigarID AND DATE_SUB(CURDATE(),INTERVAL 30 DAY) <= DateAdded

This should find any rows that have been added within the last 30 days of the DateAdded column.
**You may not want to use the COUNT() like I did because it will always return 1 row even when the count is 0.**

mschroeder 251 Bestower of Knowledge Team Colleague

You would need to make a GET request (retrieve the api url) for every address. There is no bulk API that I am aware of.

FYI, I looked at the json a little closer and realized I posted an array example earlier. The proper way to access the lat and long return values is: $object->results[0]->geometry->location->lat and $object->results[0]->geometry->location->lng where $object is the return from json_decode();

To do this via a cron script you would:

Retrieve new addresses from database (limit at 2500 because of api)
Loop over those results
For each address:
  Request the geocode api (file_get_contents or a variant)
  Process the json response
  Update the address record with the lat and lng
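
A rough, untested outline of those steps as a cron script; the table and column names are only placeholders:

<?php
$result = mysql_query("SELECT id, address FROM addresses WHERE lat IS NULL LIMIT 2500");

while( $row = mysql_fetch_assoc($result) ){
  //Request the geocode api for this address
  $url  = 'http://maps.googleapis.com/maps/api/geocode/json?sensor=false&address='.urlencode($row['address']);
  $json = file_get_contents($url);
  $data = json_decode($json);

  if( $data && $data->status == 'OK' ){
    //Update the address record with the returned lat and lng
    $lat = $data->results[0]->geometry->location->lat;
    $lng = $data->results[0]->geometry->location->lng;
    mysql_query("UPDATE addresses SET lat = '$lat', lng = '$lng' WHERE id = ".(int) $row['id']);
  }
}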

The thing to be aware of, is how you are calling your php script from cron.
If you are calling it via the command line, e.g php /path/to/your/file.php the script will have its max_execution_time set to 0 so it will run until it is completed.

If you are calling the script via something like wget or curl, you will need to be conscious of the maximum execution time. http://php.net/manual/en/function.set-time-limit.php

mschroeder 251 Bestower of Knowledge Team Colleague

You would want to use the Geocoding API: http://code.google.com/apis/maps/documentation/geocoding/

This could be as simple as capturing the output of http://maps.googleapis.com/maps/api/geocode/json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA&sensor=false
with file_get_contents, and then using json_decode on the result to get a multi-dimensional array. If you put the url above into your browser you'll see the json response returned.

<?php
$json = file_get_contents('http://maps.googleapis.com/maps/api/geocode/json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA&sensor=false');

//Pass true as the second argument so json_decode returns an associative array
$array = json_decode($json, true);

echo $array['results'][0]['geometry']['location']['lat'].PHP_EOL;
echo $array['results'][0]['geometry']['location']['lng'].PHP_EOL;

However if you read through the api docs, there is a 2500 request limit per day.

I see three ways you could implement the automation of this.

First, you could have a single cron script that just pulls all locations with a limit of 2500 and then processes those every night.

Second, you could actually do the request when the user submits an address initially. The issues with the second option would be the time it takes to complete the external request as the user would need to wait for that to happen before the record would be saved.

Third, you could use a queue, where when the user adds an address, instead of doing the request directly, it is added to a queue to be picked up later, and on a timed cycle, a cron script picks up only the items in queue and processes them.

Queueing is a great way to offset these kinds of tasks like sending emails, geocoding and other background tasks and there are lots of ways to handle queues.

mschroeder 251 Bestower of Knowledge Team Colleague

PHP's DateTime (http://www.php.net/manual/en/book.datetime.php) object accepts the MySQL DateTime format natively.

<?php
$dt = '2038-12-31 23:59:59';
$dtObj = new DateTime( $dt );
echo $dtObj->format('h:i:s a').PHP_EOL;

$ss = strtotime($dt);
echo date("h:i:s", $ss).PHP_EOL;
//Outputs:
//11:59:59 pm
//00:00:00

Keep in mind PHP's strtotime and date functions are not compatible with dates beyond Jan 19, 2038. (http://en.wikipedia.org/wiki/Year_2038_problem) e.g. The end date of a 30 year mortgage from today etc.

This has been resolved in the DateTime class since PHP 5.2

mschroeder 251 Bestower of Knowledge Team Colleague

Unless you are benchmarking your code and load testing it for these kinds of slowdowns while you're developing, you're really not going to gain much by adding caching for the sake of adding caching. Caching is part of a scaling plan, and you can't tell where your code needs to scale until you know where your code fails and why.

The functionality you described is exactly what every dynamic, database-driven piece of code does. These slides might be of interest: http://www.slideshare.net/JustinCarmony/effectice-caching-w-php-caching. Although the slides talk about memcache specifically, the concepts they cover are relevant to all forms of cache storage.

mschroeder 251 Bestower of Knowledge Team Colleague

-veledrom

I'm not sure I really understand the utility in caching the way you have described it. How have you arrived at the conclusion the database will slow down or give up if you don't use caching?

Sounds to me like you're trying to over optimize your code before you see where the code is going to bottleneck in the first place.

mschroeder 251 Bestower of Knowledge Team Colleague

-veledrom

My suggestion to you: if you want to learn how to code well, then you should be looking at well-coded examples.

Symfony's File Cache (Many others in their package as well):
http://www.symfony-project.org/api/1_4/sfFileCache
Examples:
http://snippets.symfony-project.org/snippet/110
http://snippets.symfony-project.org/snippet/99

Zend Cache
http://framework.zend.com/manual/en/zend.cache.html
Code:
http://framework.zend.com/svn/framework/standard/branches/release-1.11/library/Zend/Cache.php
http://framework.zend.com/svn/framework/standard/branches/release-1.11/library/Zend/Cache/

CakePHP Cache
http://api.cakephp.org/class/cache

You'll see they are all implemented differently but follow generally similar patterns. I would use these to try and put something of your own together, as they'll illustrate lots of best practices and are well documented.

-ardav

Caching is kind of an open-ended question. In my experience, the places where you think you need cache are usually not the places that benchmark poorly. A rule of thumb would be to cache external resources. E.g. if you're displaying daniweb's rss feed, you wouldn't use file_get_contents, simplexml, dom, etc. to parse it every time a page loads; you'd do that only if your local cache didn't exist, and since you're caching that you might as well cache the rendered results before they're displayed, as they won't be changing either. Then you're doing almost no actual work to display that remote feed until the cache expires; every hour or two would be practical.
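
To make that concrete, a minimal untested file-cache sketch for the feed example; the path, lifetime and feed url are arbitrary:

<?php
$cacheFile = '/tmp/feed_cache.html';
$lifetime  = 7200; //Two hours

if( file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $lifetime ){
  //Cache is still fresh, output the rendered markup directly
  echo file_get_contents($cacheFile);
} else {
  //Cache missing or stale: fetch, render, store, then output
  $xml = simplexml_load_file('http://www.example.com/rss.xml'); //placeholder feed url
  $html = '';
  foreach( $xml->channel->item as $item ){
    $html .= '<li>'.htmlspecialchars((string) $item->title).'</li>';
  }
  file_put_contents($cacheFile, $html);
  echo $html;
}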

Some areas that I've also seen improved with caching would be large static objects. Like a site wide ACL. Where unless you're making changes to the acl …