Hi,

I have a problem with duplicate entries being stored at the same time. I receive some parameters from another gateway (some websites) and store them in the database. I trim all the parameters, and before I store a record in the db, I check the db to see whether that record already exists.

The other gateway website sends the same notification to us three times. The first time, I capture all the parameters, check them against the db, and store the record if it does not already exist.

But sometimes a duplicate entry is stored with the same hour, minute, and second. Because of this, my db check does not catch it.

The same record ends up stored in two or three rows of the table.

I am still stuck on how to solve this problem. Please give me any ideas to solve this issue.

I am using PHP and MySQL.
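To show what I mean, here is a minimal sketch of the check-then-insert flow I am describing (the table name "payments", the connection details, and the use of $_GET are only placeholders, not my real code):

<?php
// Minimal sketch of the check-then-insert flow described above.
// Table name, connection details and $_GET are placeholders.
$db = new mysqli('localhost', 'user', 'password', 'mydb');

$name   = trim($_GET['name']);
$amount = (float) trim($_GET['amount']);

// 1) Check whether this record already exists.
$check = $db->prepare('SELECT e_id FROM payments WHERE name = ? AND amount = ?');
$check->bind_param('sd', $name, $amount);
$check->execute();
$check->store_result();

// 2) Insert only if it was not found.
if ($check->num_rows == 0) {
    $insert = $db->prepare('INSERT INTO payments (name, amount, `date`) VALUES (?, ?, NOW())');
    $insert->bind_param('sd', $name, $amount);
    $insert->execute();
}
?>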

For example, here is a sample of the records in the MySQL table. You can see that the date, time, minutes, and seconds are all the same :(

The columns are e_id, name, amount, date. e_id is unique and auto increment.

e_id   name   amount   date
125    test   150      2011-03-19 09:46:38
126    test   150      2011-03-19 09:46:38
127    test   150      2011-03-19 09:46:38

Thanks

William

I don't know if it is possible, but maybe there is a little lag in the database update, or in the time it takes the script to finish processing? If that is the case, it may also be that a browser is hitting the address multiple times. I know this happens with tracking pixels: when tracking traffic with pixels I often see Firefox firing a pixel two or three times, which is why I end up controlling it with a session.

You could do the same: create a 5 or 10 second pause between submissions by storing the previous timestamp in a session variable and comparing it to future submissions. If a submission happens twice in the same session, you can either decide to allow multiple submissions, or compare the timestamps to work out how long it has been since the last submission and enforce an x-second pause between submissions from the same browser/user.
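Something along these lines would do it (the session key name and the 10-second window are just examples):

<?php
// Sketch of a session-based pause between submissions from the same browser.
// The key name and the 10-second window are arbitrary choices.
session_start();

$minGap = 10; // seconds to require between submissions
$now    = time();

if (isset($_SESSION['last_submission']) && ($now - $_SESSION['last_submission']) < $minGap) {
    // The same browser/session submitted again too quickly: ignore or reject it.
    exit('Duplicate submission ignored.');
}

$_SESSION['last_submission'] = $now;

// ...carry on with the normal check and insert from here...
?>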

If this solves the problem, then you know what is going on: either the browser is hitting the address multiple times, as browsers are known to do with pixels, or a user is submitting multiple times before the initial submission has a chance to finish. If it doesn't solve the problem, then I don't know what will, but at least it will eliminate this as a possibility.

Another thing you can do is write something unique about each specific submission to the error log, and then look to see whether it is being duplicated multiple times within a second.
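For example, a line like this near the top of the script would show in the log whether the same submission is arriving more than once per second (which fields to log is up to you; these are just examples):

<?php
// Log something unique about this submission so duplicates show up in the error log.
error_log(sprintf(
    'gateway hit: name=%s amount=%s at %s',
    $_GET['name'],
    $_GET['amount'],
    date('Y-m-d H:i:s')
));
?>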
