Can anyone tell me how a self-driving car will park itself in the parking lot of a supermarket or at an airport?

Parking in a big parking lot should be easy: just give the human occupants two choices (a rough sketch follows the list):
1) Fastest parking: take the first open spot you find.
2) Closest parking: take the open spot closest to the shop you told it to drive you to.
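A minimal sketch of how those two strategies could be expressed, assuming the car already has a list of detected open spots; the `Spot` type and its fields are made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class Spot:
    id: str
    distance_from_entrance_m: float   # walking distance to the shop entrance
    time_to_reach_s: float            # driving time from the car's current position

def pick_spot(open_spots, strategy="fastest"):
    """Pick an open spot according to the occupant's preference."""
    if not open_spots:
        return None  # keep cruising the lot until something frees up
    if strategy == "fastest":
        # take the spot the car can reach soonest
        return min(open_spots, key=lambda s: s.time_to_reach_s)
    # "closest": minimize the occupant's walk to the entrance
    return min(open_spots, key=lambda s: s.distance_from_entrance_m)
```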

Harder would be parking in a big city, where location, busyness, and price all have to be balanced, especially since price depends on the length of the stay: the lot that is cheapest for full-day parking might be one of the most expensive for a 30-minute stop.
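One way to capture that trade-off is a weighted cost over walking distance, how busy the lot is, and the price for the intended stay. The weights, field names, and `price_for_duration` lookup below are invented for illustration:

```python
def lot_cost(lot, stay_minutes, w_walk=1.0, w_busy=5.0, w_price=2.0):
    """Score a parking lot for a stay of a given length; lower is better.

    `lot` is assumed to expose:
      - walk_minutes: walk from the lot to the destination
      - occupancy: fraction of spots taken (proxy for how long finding a spot takes)
      - price_for_duration(minutes): tariff for the intended stay, which is where
        the cheapest-for-a-day lot can lose badly on a 30-minute stop
    """
    return (w_walk * lot.walk_minutes
            + w_busy * lot.occupancy
            + w_price * lot.price_for_duration(stay_minutes))

def choose_lot(lots, stay_minutes):
    """Pick the lot with the lowest overall cost for this particular stay."""
    return min(lots, key=lambda lot: lot_cost(lot, stay_minutes))
```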

Likely, if done right, the car will just drop you off at the door and come back when you call it. The decision to find a place to idle/park or to go into a loop will be one of necessity (it needs to recharge / there is nowhere to loop) or convenience (the owner flagged it as a 15-minute stop, so drive around the block a few times until past a threshold and then park).
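That necessity-versus-convenience decision could be as simple as the following sketch; the thresholds and inputs are invented placeholders:

```python
def after_dropoff(expected_stop_min, battery_fraction, looping_allowed,
                  loop_threshold_min=15, min_battery=0.2):
    """Decide what the empty car does after dropping its owner off."""
    if battery_fraction < min_battery:
        return "drive_to_charger"      # necessity: recharge first
    if not looping_allowed:
        return "find_parking"          # necessity: nowhere to loop
    if expected_stop_min <= loop_threshold_min:
        return "loop_around_block"     # convenience: short stop, stay close by
    return "find_parking"              # long stop: park and wait for a callback
```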

Can anyone tell me how a self-driving car will park itself in the parking lot of a supermarket or at an airport?

Easy. I wrote much of the software for doing that; it's a solved problem. The decision-making part is just A* search on a road-network graph (trivially expanded to task planning), derived from sat-nav maps (e.g., Google Maps on steroids). The actual parking maneuvers can be accomplished in a number of ways, typically a variant of a lattice motion planner or a kinodynamic sampling-based planner.
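For anyone curious what "A* on a road-network graph" boils down to, here is a bare-bones toy version; the graph is an adjacency map of intersections with edge costs, and the heuristic is straight-line distance. This is just an illustration of the technique, not the production code described above:

```python
import heapq
import math

def a_star(graph, coords, start, goal):
    """graph: node -> list of (neighbor, edge_cost); coords: node -> (x, y)."""
    def h(n):  # admissible heuristic: straight-line distance to the goal
        (x1, y1), (x2, y2) = coords[n], coords[goal]
        return math.hypot(x2 - x1, y2 - y1)

    open_set = [(h(start), 0.0, start, [start])]  # (f, g, node, path)
    best_g = {start: 0.0}
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path, g
        for neighbor, cost in graph.get(node, []):
            g2 = g + cost
            if g2 < best_g.get(neighbor, float("inf")):
                best_g[neighbor] = g2
                heapq.heappush(open_set, (g2 + h(neighbor), g2, neighbor, path + [neighbor]))
    return None, float("inf")  # no route found
```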

And there would be no reason for you to waste your time waiting for the car to find a place to park. It just drops you off where you need to go.

(insert ethical question about self-driving cars)

When working in this field, the question I got most often was "if the car can either hit a grandma or a baby carriage, will it know what to do?" or some variation of that. For one, this assumes that self-driving cars can tell the difference between the two, and I don't know of any system that can (there are no magic sensors that can reliably do that). The working principle for the various standards that govern this issue in the automotive industry is that the priority is (1) a way to save everyone, (2) a way to save everyone except the passengers of the car (because they accepted the risks; others have not), and (3) a way to minimize the harm to others (e.g., brake as much as possible, to minimize the speed of collision). I think that's reasonable.
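That priority ordering can be read as a tiered selection over whatever maneuvers the planner has available; a toy illustration, where the maneuver fields are invented stand-ins for the planner's own predictions:

```python
def select_maneuver(candidates):
    """Pick an evasive maneuver following the tiered priority described above.

    Each candidate is assumed to expose:
      - harms_others: whether anyone outside the car is harmed
      - harms_passengers: whether the car's own passengers are harmed
      - expected_harm_to_others: numeric severity estimate (e.g., collision speed)
    """
    # (1) a way to save everyone
    safe = [m for m in candidates if not m.harms_others and not m.harms_passengers]
    if safe:
        return safe[0]
    # (2) save everyone else, even at the passengers' expense
    save_others = [m for m in candidates if not m.harms_others]
    if save_others:
        return save_others[0]
    # (3) minimize harm to others (e.g., brake as hard as possible)
    return min(candidates, key=lambda m: m.expected_harm_to_others)
```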

Generally, reaction times are such that, roughly speaking, for every thousand car accidents only a few are inevitable (i.e., there is no action that could prevent them). Most accidents could be prevented if people reacted faster or better. In my experience, the worst factor at play is that if you design the software to always take the safest course of action, you have a really hard time getting the car to do anything at all. Most driving on the road is so far beyond any acceptable level of risk (according to any of the standards that govern that sort of thing) that people don't know where to set the bar to match the level of risk that normal people willingly take when they get into a car.

Will self driving cars ever make it?

For commercial services, yes, definitely. Things like self-driving taxi services in urban areas and commercial freight have no major blockers as far as I can see, and there are major financial incentives. User acceptance remains to be seen to some extent, but it is not a major blocker, mostly a matter of fine-tuning and PR.

For personal cars (e.g., Tesla's Autopilot minus the life-threatening, sub-standard quality of their system), the technical hurdles are a bit bigger given the cost and design constraints (nobody wants to pay $10,000 in options for a car with ugly sensors everywhere for the few times they would actually use its self-driving capabilities). Definitely in the realm of the possible, though.


Has anyone solved the "unexpected road conditions" issue yet? - e.g. snow covering road lines, pot holes, fallen branches or construction work partially blocking the road. Or in situations where traffic lights are out or police are directing traffic.

Considering that the electronic door locking/unlocking mechanism of one car my family owned would break whenever we had -20 weather or freezing rain, I'm a bit skeptical of self-driving cars where weather & roads aren't always ideal.

@Agilemind. -20F didn't work out so well for horses and people back then either. They were, however, smart enough to just skip travel that day or night.

snow covering road lines

That's less of an issue for self-driving cars than for human drivers. Humans rely a lot on road markings to position the car; self-driving cars don't need them as much, if at all, unless they were recently shifted around (e.g., construction).

A fresh snow cover is more of a challenge for some sensors: the lack of contrast (everything is white) makes it hard for vision systems (stereo, mono, optical flow), and lidars also have trouble with returns from snowflakes and the generally higher reflectivity. With proper redundancy, this can be overcome.
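A crude illustration of what "proper redundancy" can mean in software: only trust estimates from sensors whose self-reported confidence is still healthy, and fall back gracefully when snow degrades one of them. All names and thresholds here are made up:

```python
def fused_estimate(estimates, min_confidence=0.5):
    """estimates: list of (value, confidence) pairs from independent sources,
    e.g. camera lane estimate, lidar localization, GPS/IMU dead reckoning."""
    usable = [(v, c) for v, c in estimates if c >= min_confidence]
    if not usable:
        return None  # nothing trustworthy: slow down / hand over / stop safely
    total = sum(c for _, c in usable)
    # confidence-weighted average of whatever is still working
    return sum(v * c for v, c in usable) / total
```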

Snow accumulation is another problem because it makes the world look very different from what it looked like when it was mapped out, even from one day to the next. Construction sites are basically the same problem: how to continuously keep maps up-to-date and tolerate significant changes.

pot holes

I can't see any challenge with that, but afaik, that's not a top priority right now, more of a nice-to-have comfort feature that could be added to a more mature system. You can detect and map potholes, and then avoid them in the future.
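The detect-map-avoid loop suggested here is conceptually simple; a sketch, assuming a pothole detector and a shared map already exist (both are placeholders, not any real system's API):

```python
import math

class PotholeMap:
    """Remember potholes seen by the fleet and nudge future paths around them."""

    def __init__(self, avoid_radius_m=0.5):
        self.potholes = []              # (x, y) positions in the map frame
        self.avoid_radius_m = avoid_radius_m

    def report(self, position):
        """Called when the (assumed) pothole detector flags something."""
        self.potholes.append(position)

    def penalty(self, x, y):
        """Extra path cost near known potholes, for the motion planner to consume."""
        return sum(10.0 for px, py in self.potholes
                   if math.hypot(px - x, py - y) < self.avoid_radius_m)
```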

fallen branches

Just stop? I don't think anyone is designing a self-driving car that can morph into a massive robot, like a transformer, to pick up the fallen trees and move them off the road.

As far as debris on the road, I think people are most concerned about stuff that falls off of cars or trucks in front of the car. Like, if some guy's muffler falls off in front of you on the highway, what is the car to do? But then again, I don't know that a human can do much in that situation either.

construction work

That's mostly about keeping maps up-to-date across the fleet and being tolerant of changes. This is often called the "lifelong mapping problem". This can be a tough problem to solve generally, but you can always just stop the car if things are too confusing, which would rarely happen in practice.
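A very reduced view of the "lifelong mapping" idea: compare what the sensors see now against the stored map, fold small changes back in, and stop if the disagreement is too large to trust. The similarity score and merge step are stand-ins for what is, in practice, the hard part:

```python
def update_map(stored_map, current_scan, match_score, accept=0.8, give_up=0.4):
    """match_score in [0, 1]: how well the live scan matches the stored map."""
    if match_score >= accept:
        return stored_map, "drive"                    # world still matches the map
    if match_score >= give_up:
        stored_map = merge(stored_map, current_scan)  # tolerate and record the change
        return stored_map, "drive_cautiously"
    return stored_map, "stop_safely"                  # too confusing: stop the car

def merge(stored_map, current_scan):
    # placeholder: fusing new observations into the map and sharing the
    # update across the fleet is the actual research problem
    return stored_map
```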

traffic lights are out

Not a problem. Detecting that the lights are out is pretty much required if you have a system to detect traffic lights to begin with. And the rules generally revert to a four-way stop, which the car needs to be able to handle anyway (and it's not worse than other situations, like unprotected left turns).
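The fallback described here is essentially one state-machine transition: if the detector reports the light as dark or flashing, treat the intersection as an all-way stop. The states and detector output below are illustrative, not from any particular system:

```python
def intersection_behavior(light_state):
    """light_state comes from the traffic-light detector: 'red', 'yellow',
    'green', 'flashing', or 'out' (no valid light detected)."""
    if light_state in ("out", "flashing"):
        return "all_way_stop"      # local rules generally revert to a four-way stop
    if light_state == "green":
        return "proceed_if_clear"
    return "stop_at_line"          # red or yellow
```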

police are directing traffic

I don't know that this has been looked at specifically, but modern machine learning techniques are pretty good at detecting people, their intentions/attention, and body gestures. This might require some work to train that skill and to deal with locale-dependent police uniforms and all that. Again, it's a bit of a low priority at the moment, given how rare this is in practice. No technical hurdle there.

Considering that the electronic door locking/unlocking mechanism of one car my family owned would break whenever we had -20 weather or freezing rain, I'm a bit skeptical of self-driving cars where weather & roads aren't always ideal.

Consider that electronic door locks are not designed to the same standards as the more critical systems. You will never see your ABS braking system or electronic stability system go bananas in cold weather. Car manufacturing is very much constrained by costs (down to the pennies on some parts), and parts are designed up to the standard they need to meet, no more, no less. Non-critical accessories will wear out after some time or fail in some conditions, which is calculated to be an acceptable inconvenience (given the probability of occurrence, the inconvenience, the cost of repairs, etc.).

One challenge with self-driving cars is that there are no standards strict enough to govern those systems, while the standards that do exist (e.g., ISO 26262) are too limited to allow a system to take complete responsibility for the car's safety. For example, acceleration and steering are limited so as to make it impossible for systems like lane-keeping assist to make "wild" maneuvers on the road, but if the system is ultimately responsible for the lives of its passengers, it has to be able to make those wild maneuvers when that is the only course of action that saves lives (e.g., an elk pops up on the road and you need to steer away violently, in a way that far exceeds the standard allowable limits).
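That tension shows up very concretely in the control limits. A sketch of a command limiter that clamps to a narrow envelope in normal driving but would need a much wider one for genuine emergencies; the numbers are arbitrary placeholders, not values taken from any standard:

```python
def clamp_command(steer_rate, decel, emergency=False):
    """Clamp steering rate (rad/s) and deceleration (m/s^2) to an allowed envelope.

    'Normal' mimics the kind of limits a driver-assist standard imposes;
    'emergency' is the wider envelope a fully responsible system would need
    for an elk-on-the-road maneuver. Both sets of numbers are invented.
    """
    max_steer, max_decel = (1.5, 9.0) if emergency else (0.3, 3.5)
    steer_rate = max(-max_steer, min(max_steer, steer_rate))
    decel = max(0.0, min(max_decel, decel))
    return steer_rate, decel
```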
