Wordle 934 3/6
⬜🟨⬜🟨🟨
⬜🟩🟩🟩🟩
🟩🟩🟩🟩🟩
Wordle 934 6/6
⬜⬜⬜🟨⬜
⬜⬜🟨⬜🟨
🟨🟩⬜⬜⬜
🟩🟩⬜🟩⬜
🟩🟩⬜🟩🟨
🟩🟩🟩🟩🟩
Wordle 934 2/6
⬜🟨⬜⬜🟨
🟩🟩🟩🟩🟩
One of those days!
Wordle 934 3/6
⬜⬜⬜🟨⬜
🟨🟨⬜🟩🟩
🟩🟩🟩🟩🟩
"That which can be asserted without evidence, can be dismissed without evidence."
- Christopher Hitchens
Wordle 934 5/6
⬜🟨⬜🟨🟨
⬜⬜🟨🟨🟨
🟨⬜🟩🟩🟩
⬜🟩🟩🟩🟩
🟩🟩🟩🟩🟩
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
Wordle 934 5/6
🟨⬜🟨🟨⬜
⬜🟩🟩🟩🟩
⬜🟩🟩🟩🟩
⬜🟩🟩🟩🟩
🟩🟩🟩🟩🟩
Gah!
I only just made it in 5
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
I won't tell if you don't.
Jeremy Falcon
Wordle 934 6/6*
⬜⬜🟨⬜🟨
⬜⬜⬜🟨🟨
⬜🟨⬜🟩⬜
🟩🟩⬜🟩🟩
🟩🟩⬜🟩🟩
🟩🟩🟩🟩🟩
Happens to all of us
Happiness will never come to those who fail to appreciate what they already have. -Anon
And those who were seen dancing were thought to be insane by those who could not hear the music. -Friedrich Nietzsche
Wordle 934 3/6
⬜🟨⬜🟨🟨
⬜🟨⬜🟩🟨
🟩🟩🟩🟩🟩
Ok, I have had my coffee, so you can all come out now!
Wordle 934 5/6
⬜🟨⬜🟨⬜
⬜⬜🟨🟩🟩
⬜🟩🟨🟩🟩
🟩🟩⬜🟩🟩
🟩🟩🟩🟩🟩
Jeremy Falcon
How long before the self-driving car claims stop?
I am talking about the claim of replacing all cars, not about autonomous vehicles driving around a warehouse.
Consider this as a scenario: in 2021, 42,000+ people died in automobile accidents in the US. About 1,000 of them were children. Note that injuries run far higher.
So let's say self-driving cars worked and deaths dropped by two orders of magnitude: 420 people, 10 of them children.
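As a sanity check, the "two orders of magnitude" arithmetic above can be written out. The figures are the approximate ones cited in this post; the factor of 100 is just the stated assumption.

```python
# Back-of-the-envelope check of the figures above (illustrative only).
us_road_deaths_2021 = 42_000   # approximate US total cited above
child_deaths_2021 = 1_000      # approximate, cited above

reduction = 100                # "two orders of magnitude"
print(us_road_deaths_2021 // reduction)  # 420
print(child_deaths_2021 // reduction)    # 10
```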
Now, in the vast majority of modern accidents a driver is found to be at fault: drunk, texting, distracted, reckless, etc.
But with a self-driving car no person can be at fault, because nobody was driving.
In some of those cases, especially those involving children, someone is going to blame the car. Not the specific car, but the manufacturer of the car.
And then they will sue for 10 million. Or 100 million.
Consider that just in the past week a door (sort of) blew off an airplane and all planes of that type were grounded.
Is the government going to ground a couple million cars? It might even be possible with self-driving cars: just send a signal.
One self-driving car company is likely going out of business because its car drove to the side of the road with a pedestrian underneath.
Now if a person had been driving, the driver presumably would have been at fault - if anyone could have determined the correct behavior in that bizarre case. Slamming on the brakes in the middle of a highway might not be the best action. So what is right? Who gets to decide that?
And even if the action was exactly right, is a lawsuit against the company still going to happen?
I think most programmers would never get in a self-driving car. We all know how buggy our own code is.
Heh, just thought: buggy Buggy code.
I've given up trying to be calm. However, I am open to feeling slightly less agitated.
I'm begging you, for the benefit of everyone: don't be STUPID.
MarkTJohnson wrote: most programmers would never get in a self driving car.
20 CEOs of avionics companies were all on a plane, just before taxiing away from the terminal. The purser came down the aisle, and whispered to each CEO that their company's avionics are controlling the plane. 19 out of 20 CEOs got off the plane immediately, while the 20th stayed in place.
The purser said that he/she/it must be very confident in the company's avionics, to which the answer was: "With the programmers we employ, the plane won't even take off!"
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
I was expecting a punchline that takes a shot at CEOs. I wasn't expecting a dig at programmers.
The difficult we do right away...
...the impossible takes slightly longer.
Buggy2
Software Zen: delete this;
I agree that automatic driving has the potential to drastically reduce the number of car accidents. However, given the litigious climate in the U.S. (and increasingly - in the rest of the world), I doubt whether any car manufacturer will actually advertise "automatic driving" as a feature.
The only way that I see this happening is for car manufacturers to be required to submit their cars for rigorous external tests, in return for receiving legal indemnity from lawsuits. Something similar exists for vaccines.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
Daniel Pfeffer wrote: The only way that I see this happening is that car manufacturers be required to submit their cars for rigorous external tests
I'm pretty sure the NHTSA is already responsible for that. For testing the self-driving features? Probably not so much. I don't really see a government agency keeping up.
Daniel Pfeffer wrote: , in return for receiving legal indemnity from lawsuits. Something similar exists for vaccines.
Reagan indemnified pharmaceutical companies back in the 80s. Sure, there's plenty of testing going on, but holding Big Pharma accountable should still be a thing.
[Edit]
Worse, it's actually called the National Childhood Vaccine Injury Act of 1986. Can't sue for harming your kids with a bad vaccine. That sounds so wrong...
The problem is not the car or the software, it's the environment.
My understanding is that the auto-drive vehicles continuously scan the environment, looking at the road, lines on the road, signs, other vehicles, etc. This works great if everything is marked well and signs exist.
However, if the road is unmarked (fresh pavement), the lines are badly worn, or there are old-n-new lines, things get dicey.
Recently I drove through construction where there were three sets of lines; two looked fresh, and all ended abruptly at different places. *I* had a problem figuring out where my lane was. How is software going to do it?
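To make the ambiguity concrete, here is a minimal sketch of how a lane-detection module might bail out when two sets of markings score almost equally well. Every name and threshold here is invented for illustration; real perception stacks are far more involved.

```python
# Hypothetical illustration: a lane-keeping module reacting to conflicting
# lane markings (e.g. old and new paint in a construction zone).
# All function names and thresholds are invented for this example.

def choose_lane(candidates, min_confidence=0.8, max_ambiguity=0.1):
    """candidates: list of (lane_id, confidence) pairs from the vision stack.
    Returns the chosen lane_id, or None to signal 'disengage / slow down'."""
    if not candidates:
        return None  # no markings detected at all (fresh pavement)
    best = max(candidates, key=lambda c: c[1])
    others = [c for c in candidates if c is not best]
    if best[1] < min_confidence:
        return None  # nothing we trust enough to follow
    # If a second candidate scores nearly as high, the scene is ambiguous
    # and the safe choice is to hand control back rather than guess.
    if others and best[1] - max(c[1] for c in others) < max_ambiguity:
        return None
    return best[0]

print(choose_lane([("new_paint", 0.92), ("old_paint", 0.90)]))  # ambiguous -> None
print(choose_lane([("new_paint", 0.95), ("old_paint", 0.60)]))  # -> new_paint
```

The point of the sketch is that "three sets of lines, two looking fresh" lands squarely in the ambiguous branch, where the only defensible behavior is to refuse to choose.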
Self-driving vehicles will continue until someone connected to money gets seriously injured or killed. THEN the house of cards will come crashing down.
These safety problems are addressed in autonomous driving cars.
I work in the automotive industry, and I can assure you that I would buy an (electric) AD vehicle from any European or Japanese car manufacturer without any discussion.
I still have my doubts regarding the quality of EVs from China. And do not get me started about the Musk company - nice toy, best car to die in.
Rage wrote: These safety problems are addressed in autonomous driving cars.
Not sure I understand your statement.
For example, self-driving cars are not going to prevent heated seats from catching fire. That is one of the recalls.
And Tesla has a recall in effect to reduce the ability of their cars to self-drive. So that one is very specific to self-driving.
I don't see how any of that leads to OP's claim that self-driving cars will never happen. Recalls happen. All the time. My dad was a mechanic for over 40 years, and recalls have provided plenty of work, even for the silliest things. Buggy self-driving software? That's an over-the-air update, I don't see that as a big deal.
I suppose retro-fitting an existing car with new sensors would be something else. But then, if there was a need for that, the manufacturers would just take the feature away and claim it was never sold as "fully self-driving" anyway.
Just a small note on the self-driving car that dragged the person when it attempted to pull off to the side of the road. The person was first hit by a car driven by a human, and their body was thrown in front of the self-driving car, which could not stop (physics). The human driver fled the scene and is still being sought.
Unfortunately, the person fell into a spot that was outside the range of the car's sensors. The car proceeded to try to pull over to wait for help with the accident, and made things worse by running them over.
Yes, a human being would get out of the car and try to aid the injured person before moving their car (unless they panicked and simply drove away).
Yes, the self-driving car needs its software upgraded to cover the case where a body is thrown in front of it, it collides with said body, and it cannot locate the body after hitting it. In that case it needs to simply stop, call 911, and wait for human assistance.
Yes, there are many unique things that a car can encounter. Will software ever be up to the challenge? I honestly don't know. But I do know that the current carnage on our highways will continue, with or without automated driving help. Software can be upgraded; people, not so much.
Gary Stachelski 2021 wrote: Yes, a human being would get out of the car and look to aid the injured person before trying to move their car (unless they panicked and simply drove away.)
Not sure I agree with that.
Not even sure I agree that that is the best action to take.
As I noted, that was on a highway. And at night.
Not sure about you, but for me slamming on the brakes at any time on a highway is not something I consider safe. Not for me, and not for the cars behind me.
Also, as a driver I have been in an accident where I had no idea what had happened. Also on a highway. So the 'correct' behavior becomes much less clear.
Gary Stachelski 2021 wrote: the self driving car needs to have its software upgraded to include the case where a body is thrown in front of it,
For a driverless vehicle that means programming every possible scenario. That is just not going to happen.
Some examples.
I have been on a higher-speed road where the car in front of me hit a bumper that had fallen off another car, launching it into my car.
I have seen a car get sideswiped (literally knocked off the road) because it came from behind and sped up in a turn lane beside a long line of stopped cars, and one of the cars in the stopped line changed lanes abruptly. I saw the car speed up because I was further back in the line of cars; I am not sure it was even visible to the car that changed lanes.
I have seen a bicyclist going the wrong way down a one-way street at night, with no lights, moving quickly. I actually know that person, and he had previously been in an accident doing exactly the same thing, except that time he was hit and went flying over the car.
Note that in these scenarios it is not only that the car must be programmed to handle them, but that the car maker must be able to show that what the car did was the correct and best way to handle them.
Gary Stachelski 2021 wrote: Will software ever be up to the challenge? I honestly don't know.
That, however, is the point. When those accidents do occur, the car maker will be sued for large amounts of money.