Who, What, and How We Represent Matters.
Values-driven personal injury lawyers in Austin, Texas here to listen, educate, and help in any way we can.
Contact Us

Joe vs. The Volcano vs. Self-Driving Cars

Josh and Aaron discuss what the future of self-driving cars might look like for personal injury law, and what we should be thinking about.

Listen here or read the transcript below. FVF’s Summary Judgment podcast is available wherever you listen to podcasts including Apple Podcasts, Spotify, iHeart Radio, and more.

0:00:00.0 Josh Fogelman: Thank you for tuning into Summary Judgment, where Austin personal injury attorneys Josh Fogelman and Aaron von Flatern of FVF Law discuss the ins, outs, and in-betweens of personal injury cases.

[music]

0:00:19.0 JF: Hey, Aaron.

0:00:19.1 Aaron Von Flatern: Hey, Josh. That’s my line.

[laughter]

0:00:21.5 JF: Stole it.

0:00:23.1 AF: All right.

0:00:24.7 JF: It’s public domain.

0:00:24.9 AF: Sue you later.

0:00:25.8 JF: Bro. Have you seen the movie Joe Versus the Volcano?

0:00:29.3 AF: No less than 20 times.

0:00:32.0 JF: The first crush I ever had was Meg Ryan in Joe Versus the Volcano.

0:00:39.4 AF: All of her characters.

0:00:40.0 JF: That’s fair.

0:00:40.9 AF: I was in love with all three of them.

0:00:41.8 JF: Yeah. No, she’s pretty cool. How does Joe Versus the Volcano relate to self-driving cars?

0:00:52.4 AF: You don’t see it?

[laughter]

0:00:56.4 JF: I heard through someone that we both know that you can make the connection.

0:01:02.1 AF: Okay. Let me lay it out there. Okay, so let’s go… First of all, get in the time machine. We’re going forward.

0:01:10.6 JF: Yes. Do you wanna get in my time machine or your time machine?

0:01:14.3 AF: Get in mine.

0:01:15.4 JF: Okay. Yours is nicer.

0:01:17.5 AF: I’m driving.

0:01:18.4 JF: Yeah, yours is nicer.

0:01:18.8 AF: We’re…

[laughter]

0:01:19.5 JF: You’re not a better driver, but your time machine is definitely nicer.

0:01:21.6 AF: I’m not as cheap as you, so enjoy the air-cooled seats in our time machine, and we’re going about 15 years forward. All right. At this time, all of the cars are autonomous, so essentially they’re robots using a combination of AI and machine learning to figure out the roadways. They understand the difference between a dog and a construction cone and a person, and they’re able to attain high speeds. And car ownership has probably changed. We’re in a place now where you don’t really own a car. There’s a stream of robots that go by your house and you just jump in whenever you want. It goes wherever you want to go. You get in and outta the stream of traffic all the time. Nobody even has garages anymore. This is a really interesting…

0:02:20.0 JF: Yeah. Sounds great.

0:02:20.9 AF: Future for us.

0:02:21.2 JF: Sounds great.

0:02:23.1 AF: One of the things that is celebrated in the news 15 years forward is the fact that there has been a dramatic decrease in the number of injuries and deaths from car accidents. Would you agree that’s a likely outcome?

0:02:43.6 JF: Yeah, I would think that cars, in theory, will statistically make fewer errors than humans, because they’re not distracted and they’re not having sudden emergencies, and there’s a whole slew of human conduct that’s removed from the equation. You see it now with the advent of safety features like pre-collision detection systems and things of that nature that are avoiding crashes already. So yeah, it makes sense.

0:03:17.3 AF: Robots are dumb, but humans are dumber. We’re eating cheeseburgers and checking our phones and doing dumb stuff on the road right now, right this second. So 15 years hence, robots are doing it for us and they’re still making some mistakes. And this is what I wanna talk about: say there’s a death. How do we look at that as a society? Because the way I see it is, in Joe Versus the Volcano, the tribe wants to sacrifice someone. They’re like, “Let’s take a person up to the volcano and toss them in, in order to assure our society’s protection and providence, because this is what’s going to please the gods. We’ve gotta throw someone in.” And that’s kind of the decision we’re faced with as a society moving to robot vehicles.

0:04:16.3 AF: Yes, we know 100% for sure that the robots are going to go rogue and kill somebody. Some mistake, some programming error. What was working when it first rolled off the line is working a little bit wonky 10 years later, and the robot ends up causing a death. How do we look at that? Did we just throw that person into the volcano because the rest of us want to be in a society where there are fewer car crashes, fewer injuries, fewer deaths? And if that’s all true, what do we think about that sacrifice? When that family hires a lawyer to sue the manufacturer, and they’re showing up in court, and we’re in front of the jury asking for an amount of money, it’s a little different. You’re not just asking for the value of a human life. You’re not just asking for the value of the relationships between the people, all the grief and harm. I mean, you’re asking for that, but you’re asking for so much more than that. You’re asking the jury to recognize: what is the price that we’re going to put on this? We’ve made this very interesting choice. What should we do? In my opinion, we all have a vested interest in the price tag being extremely high. I’m talking something with a B in front of it for that death. How else are you going to keep the manufacturers honest?

0:05:56.5 JF: Exactly.

0:05:58.9 AF: And then also, how else are you going to recognize the fact that there are probably billions less in injuries and deaths out there because of this technology? It’s like, “Hey, we’ve all saved. There’s less property damage, there’s less lost earnings, there’s less in medical bills, there’s less in funeral expenses, there’s less harm emotionally to these families.” Are we not willing to transfer those savings collectively, as a society, over to that one family who had to sacrifice, who lost their father, child, mother, wife, daughter? I think the answer should be yes.

0:06:40.3 JF: Yeah. It’s kind of an interesting moral quandary. Where do we draw the line on when it’s safe enough, when you know the robot’s not perfect and never will be, but as a whole the robots will be better than the people? You’re just sort of shifting the focus of who’s going to be the victim. Right now it can be pretty well randomized; you never know if you’re going to be the person that’s t-boned by the distracted driver or the drunk driver. But it certainly presents an interesting moral predicament about where we are now, what our society is allowing, how different communities are establishing different rules around self-driving cars. You’ve got pilot cities that are allowing the self-driving cars to go around and pick people up and deliver them places, and some places where that’s not allowed.

0:07:53.0 JF: My father-in-law is a big car guy. He writes about cars, a lot specifically about electric cars now, and he’s become sort of an electric car expert. He’s over in Germany and he has a blog about them. He was visiting a couple of years ago, and he wanted to drive a Tesla. He gets press cars, he can drive cars from all different manufacturers all the time, but Tesla in Germany wouldn’t let the press, or at least his level of the press, kind of a hobby journalist, access one of their cars. So he came over to America and I arranged a test drive, I think it was a Model Y. And this Model Y had the full self-driving capability. We were in the car and I was actually driving at this time, but he was in the car with me and drove later.

0:08:48.9 JF: But I was driving, the Tesla representative’s in the back of the car with me, and he’s like, “Let’s test the auto lane change feature.” I’m like, “Okay, that’s fine.” We’re on MoPac at the Domain. I engage it, put on my turn signal, and I’m watching, of course, and I’ve got my hands on the wheel like you’re supposed to. And this thing starts to change lanes. It’s totally clear, there’s no car next to us. It gets about halfway into the lane and then jerks itself back into the center lane, very aggressively. It was terrifying and alarming. Even the Tesla rep in the back was like, “Oh my, it’s never done that before” type situation. And you kind of realize, look, these things are out on the roads. People are already using and abusing the self-driving car privilege. You can buy weights on Amazon to attach to the steering wheel to trick your car into believing that your hands are on the wheel while you have it engaged in self-driving mode.

0:10:08.5 AF: That’s bold.

0:10:10.2 JF: It’s a thing. It’s a thing that’s real in our society now. So we’re already starting to see it, and there are times when these vehicles crash.

0:10:24.6 AF: We have one of those cases.

0:10:28.4 JF: It’s real.

0:10:31.4 AF: Well, there’s the story of Tesla first releasing their autopilot mode, then gradually releasing full self-driving mode at an additional cost to drivers, then having a lot of stuff go wrong, getting in trouble with the federal government, and finally, just recently, recalling 2 million vehicles with full self-driving mode. That happened at about the same time one of our clients got seriously hurt, when a Tesla vehicle suddenly turned left across traffic moving at pretty good roadway speed. I won’t go into the details too much ’cause this case is still open, and I don’t remember the speed limit in this case, but it caused severe injuries, and it’s also causing a lot of PTSD, because when we have car accidents, I think people experience a little PTSD afterwards.

0:11:43.4 AF: But when they realize, okay, look, this is a human, humans make mistakes, generally I can trust humans, they get back to where they can drive comfortably and they’re kind of getting out of that PTSD framework. When it’s a bolt out of the blue, a random missile, we find people’s PTSD lasts longer. A great example is drunk drivers. People don’t get over PTSD as fast with drunk driving crashes, because of their understanding that they were essentially attacked by a random missile, not a reliable, rational human who just made a mistake. And…

0:12:19.2 JF: Yeah. Most reliable, rational human drivers aren’t driving on the wrong side of the road with their headlights off, giving you zero chance of evading a collision. It’s the surprise element there.

0:12:31.5 AF: Yeah. The surprise, and the knowledge that, for 100% sure, there are more random missiles out there. That is what erodes your confidence and keeps you in that PTSD framework where you’re afraid to drive. And I don’t take the term PTSD lightly. I know people are going to hear that and think, well, that’s only for first responders and soldiers. The fact is that PTSD can be measured. We’ve hired the world’s leading PTSD expert at Harvard for some of our cases, and we’ve seen it. We’ve seen this as a real, debilitating problem for people. And I think it’s going to get worse if you have robot cars causing the crashes.

0:13:14.6 JF: It’s going to get worse before it gets better, for sure, because there’s going to be this weird transition period, like there is in any technological revolution. And it kind of puts us in a weird quandary, because of the way our civil justice system is set up now. We as a society demand compensation in terms of money, monetary damages, when someone is careless and hurts you. That’s the only way we know how to try to make things right. And part of that is accountability: holding the wrongdoer accountable for their wrongful conduct. But you have this kind of weird situation where a person is allowed to buy an instrumentality, a vehicle, a car that purports to be able to drive itself, with the caveat that you are responsible for maintaining control of that vehicle by keeping your hands on the wheel. Well, who’s at fault if the vehicle decides on its own, in a split second, to do something reckless while you’re supposed to be in control of it?

0:14:55.5 JF: It creates a complicated question. Who’s supposed to be accountable? Well, the person who’s got their hands on the wheel and is in charge of that vehicle obviously holds some degree of responsibility if they’re not doing what they’re supposed to be doing with regard to maintaining control. But, to your point earlier, you also have to have some accountability on the company that put that vehicle into the hands of the driver and said, this is safe. You can use our product, you can use this service we’re providing you; it’s safe to do.

0:15:37.7 AF: Yeah. It’s speaking out of both sides of your mouth to say, hey, pay me extra to get this feature where the car will drive itself, oh, but by the way, if anything happens, it’s your fault, because you’re supposed to be driving it. If you have to have your hands on the wheel and your foot hovering over the brake, are you getting any benefit whatsoever? It’s an illusory benefit. You’re not getting any benefit from full self-driving if that’s how it is. And one of the ways that Tesla appeases the people who are paying money for this is to not turn full self-driving mode off the second you take your hands off the wheel; it waits, and eventually tells you to put your hands back on. So there are some ways you can tell Tesla has a culture of asking for forgiveness, not permission. They have a huge incentive to get the data that comes from people letting the car drive itself. They want to know what happens. And that experiment is happening in real time, and it’s fed to Tesla constantly. Our client in this particular case has been thrown into the volcano, in my opinion. And frankly, we’re not going to compromise very much on this case.

0:17:07.3 JF: Yeah. And I think as a society, as users of the road who have our families in our cars on the same roads as these other vehicles, we have the responsibility to ask the hard questions and hold accountable those who are responsible for creating these types of dangers that we are all going to be dealing with for the foreseeable future. It’s going to be really interesting to see how this technology unfolds, and to see how state, local, and federal governments respond to this, not just in regulating it. When’s enough, enough? When has enough damage been caused? When do we cross a threshold, if ever, of understanding that the danger is too great to allow us all to be potential candidates to be thrown into the volcano? And if we never cross that threshold, then what kind of legal standard do we apply to these companies to hold them accountable for the experiments that they’re running, sort of at our expense?

0:18:42.0 AF: Do you know the last form of government that is truly of, by, and for the people?

0:18:48.9 JF: Jury trials.

0:18:49.0 AF: Jury trials. I mean, that’s the government that’s going to come in and do something. And what I’m curious about is how juries will view the families who are sacrificing for the rest of us.

0:19:00.7 JF: Sure. And that’s always one of the interesting things about jury trials in these sort of David versus Goliath cases. There’s always more to the story. What do those internal memoranda and emails say? What did the company know when they were convincing the government to let them run this experiment? So sure…

0:19:26.2 AF: I can’t wait. I honestly cannot wait to read what I assume is going to be millions of pages of discovery in this case.

0:19:33.7 JF: It’s going to be interesting. Beyond this case, it’s just an interesting time for this revolution, not just to see how it unfolds and the benefits it might be able to provide to society. And maybe long term we get there. Hopefully we do. We’re not there now, but maybe sometime we get there. But also to see how we as a society respond to that. So, going to be interesting. This is certainly one to have some fun with.

0:20:00.0 AF: All right. Well, thanks for getting in the car.

0:20:04.5 JF: It’s always a risk with you, man.