Hi!

I'm having a hard time figuring out how to solve the problem at hand. Basically, I have several tables that are interconnected in various ways, and I need to pull all this data together and format it so that it shows the total number of trips and the sum of the total rate per plate number.

I'm using INNER JOIN to put all the tables together, but somehow I'm getting duplicate rows that mess up my grand total computation.

Here is my query to get the total trips and sum:

SELECT DISTINCT dglobal.`vehicle_id`, dglobal.`plate_no`, dglobal.make, dglobal.model,
    COUNT(DISTINCT pymnt.`reference_no`) AS bilang,
    SUM(pymnt.grand_total) AS total
FROM transp36_transportpal_global.vehicle_info dglobal
INNER JOIN transp36_transportpal_$dbname.vehicle_details vdetail ON dglobal.`vehicle_id` = vdetail.`vehicle_id`
INNER JOIN transp36_transportpal_$dbname.reservation_details res ON vdetail.`reference_no` = res.`reference_no`
INNER JOIN transp36_transportpal_$dbname.payment_details pymnt ON vdetail.`reference_no` = pymnt.`reference_no`
WHERE res.rental_date BETWEEN '$from' AND '$to'
AND res.status = 'Served'
AND pymnt.payment_type NOT IN ('Complimentary', 'Non Revenue Ticket')
GROUP BY dglobal.`vehicle_id`

Note: the problem is that I have duplicate rows in the vehicle_details table. I already tried using DISTINCT, but it's still not working.

This is the output of my script.

https://s31.postimg.org/lpdyr1bcb/Untitled.png
and https://s31.postimg.org/5d4dkpvyz/image.png

Hope you can help me.

Thanks, Neil!

Try including plate_no, make, and model in your GROUP BY, as in the sketch below.

I'm not sure how you would end up with multiple matching rows, but this may be your issue :-/
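
Something like this, just extending the GROUP BY of your existing query so that every non-aggregated column in the SELECT is grouped on (untested against your data, but MySQL with ONLY_FULL_GROUP_BY enabled would require this anyway):

-- group on all four non-aggregated columns instead of vehicle_id alone
GROUP BY dglobal.`vehicle_id`, dglobal.`plate_no`, dglobal.make, dglobal.model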

Still getting the duplicate record. Instead of adding the grand total for the specific transaction only once, it's being summed twice because of the duplicate record in the vehicle_details table. How can I make my query ignore the duplicate record instead of counting it?

I'm using vdetail.reference_no = pymnt.reference_no for the join.
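
In case it helps, this is the kind of diagnostic query I used to confirm the duplicates (just a plain GROUP BY / HAVING check on that table, not part of the report itself):

-- list reference numbers that appear more than once in vehicle_details
SELECT reference_no, COUNT(*) AS copies
FROM transp36_transportpal_$dbname.vehicle_details
GROUP BY reference_no
HAVING COUNT(*) > 1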

Well... if you have duplicates, what exactly makes them different from each other, so that they can be told apart as individual rows?

If they are exact duplicates, then you will have to take the MIN or MAX of your index (primary key) to single one of them out, or fix your data :-/
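
As a sketch of that idea: assuming vehicle_details has an auto-increment primary key (I'll call it id here, since I don't know your actual column name), you can collapse the duplicates in a derived table before joining, so each reference_no only matches the payment row once:

-- replace the vehicle_details join in your query with a deduplicated derived table
INNER JOIN (
    SELECT MIN(id) AS id, vehicle_id, reference_no
    FROM transp36_transportpal_$dbname.vehicle_details
    GROUP BY vehicle_id, reference_no
) vdetail ON dglobal.`vehicle_id` = vdetail.`vehicle_id`

Since your outer query only uses vehicle_id and reference_no from vdetail, grouping on those two columns is enough to remove the fan-out; the MIN(id) just picks one representative row out of each set of duplicates.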
