How will we program our driverless cars to react in situations where harming someone is unavoidable? ... Do you want your car to kill you (by hitting a tree at 65mph) rather than hitting and killing someone else? No? How many people would it take before you'd want your car to sacrifice you instead?
Our sense of the world, of right and wrong, is a collection of nice round holes; the trolley problem is a big square peg hovering menacingly above them.