Culpability

September 25, 2025

Culpability

by Bruce Holsinger

Mom loaned me this. A brief plot recap, absolutely packed with spoilers. Seriously, if you are reading this and have not read the book, you should STOP HERE. These notes are just for me.

Noah and Lorelai are married with three kids: Charlie, Alice, and Izzy. They are in a bad car accident, and over the course of the book, various aspects of who was at fault are revealed:

  • The car was a SensTrek minivan with a self-driving mode.
  • Charlie was in the driver’s seat, but he was using the self-driving mode. He was supposed to stay alert in case he needed to override it manually. Instead, he was texting.
  • Alice saw him texting but didn’t want to tattle. But she did want him to get in trouble. She decided to scream something like, “Charlie, look out!” to get her dad to notice that Charlie was texting.
  • Noah, in the passenger seat, had his laptop out, writing something up for work. Since Charlie was 17, Noah was responsible for supervising his driving, but he was not paying attention.
  • When Alice screamed, Charlie grabbed the wheel and swerved, overriding the self-driving mode. He swerved into a car in the oncoming lane. The two people in that car, an older couple, were killed in the accident.
  • Izzy was the one who was texting Charlie (mean things about Alice), and she started it.
  • Lorelai, an AI researcher and ethicist with a PhD, was deeply involved in the development of the AI system that controls the minivan.

Most of that only comes out well into the book. So it’s a complicated circle of culpability.

After the accident, the family goes to a vacation house on some water. They are near a massive property owned by a tech billionaire, Daniel Monet. Charlie meets Monet’s daughter, Eurydice, and they instantly fall for each other. After a party there, she gets him to take Molly, and they sneak her sailboat out at night. They are caught in a storm; she goes missing, and he is injured. She is ultimately found and okay.

Another theme is survivor’s guilt. “Yes, that feeling. I know it already, like an exciting new friend.” (23)

Now some notes I jotted down while reading this:

“A family is like an algorithm… Like an algorithm, a family is endlessly complex yet adaptable and resilient.” (3) Why do people insist on using the word algorithm without learning what it means?

Here is an example of an algorithm (Bubble sort, in Python):

mylist = [64, 34, 25, 12, 22, 11, 90, 5]

n = len(mylist)
for i in range(n - 1):  # each pass bubbles the largest remaining value to the end
    for j in range(n - i - 1):
        if mylist[j] > mylist[j + 1]:
            mylist[j], mylist[j + 1] = mylist[j + 1], mylist[j]  # swap values

print(mylist)  # [5, 11, 12, 22, 25, 34, 64, 90]

Does that look “endlessly complex”? It is not: you go through the list two values at a time, and if the first one is bigger, you swap them. It is not “adaptable,” either: you have to give it a list of values it can compare, or it won’t work. That is an algorithm.

Anyway, kind of a pet peeve. But I also worry that the word is changing. Since sites like Facebook, Twitter, and YouTube refer to their method of determining what content to show you as “the algorithm,” maybe the word is just going to end up meaning “complex tech thing no one understands.” Bah.
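
To be fair, even a feed “algorithm” is, at bottom, a defined procedure. Here is a toy ranker of my own invention (names and numbers completely made up; the real sites layer machine learning on top of something like this): score each post by likes discounted by age, then sort.

posts = [
    {"title": "cat video", "likes": 900, "hours_old": 2},
    {"title": "hot take", "likes": 40, "hours_old": 1},
    {"title": "old news", "likes": 5000, "hours_old": 72},
]

def score(post):
    # made-up ranking rule: likes, discounted by age
    return post["likes"] / (post["hours_old"] + 1)

# show the "feed," highest score first
for post in sorted(posts, key=score, reverse=True):
    print(post["title"], round(score(post), 1))

Still just a procedure. The “no one understands it” part comes from the machine-learning layers, not from the word.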

The word luck/lucky is used many times (intentionally). Kind of becomes a sarcastic theme. Lucky kid. You guys are lucky. Good fortune.

The trolley problem comes from British philosopher Philippa Foot, 1967.

“Charlie has never had a girlfriend, or boyfriend for that matter.” Tossed off as if that wouldn’t matter to this dad. Doesn’t ring true. A little later, therapy and meds are presented as a simple solution to some of Lorelai’s problems with OCD.

HELLOS drones cause fewer accidental casualties than conventional assaults, 3% vs. 10%, but the 3% are random errors. This reminds me of airplane automation.

Re: the butterfly effect and AI: “We all bear the unknowing burden of the butterfly, flapping our fragile wings in ignorance of what is to come.” (211)

After the boating accident, when they find Eurydice alive, I actually said out loud: “Good, I was about to hate this book.” (286)

I liked Lorelai’s speech to Daniel (293) – No matter how good the algorithm is, we can’t protect our kids from everything.

And Detective Morrisey’s speech (297) – AI is “making it impossible to hold anyone responsible for wrongdoing.”

Lorelai: “There’s a place for algorithms… But people have to be better, too.” This is naive, and she would know that. The idea behind self-driving has to be “no human driving, ever.” A hybrid where the human has to stay alert while doing nothing is stupid. That’s not a moral failing in people; it’s a failure to recognize how people are.

Noah’s role in the marriage: “I am scaffolding.” L calls him “the foundation.”

There’s a tender moment between Charlie and Lorelai toward the end, and Noah feels left out. “Hey, look at me! I’m at fault too!” – This is the only time I laughed out loud while reading this. Relatable.

Lorelai says the AI will be “exponentially smarter” than us. (318) It’s hard to write characters who are smarter than yourself. The author clearly did tons of research, but little lines of dialog like this can give you away. No self-respecting nerd would casually misuse the word “exponentially” to mean “a lot.”
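
For the record, “exponentially” means multiplying by some factor at every step, not “by a large amount.” A toy comparison (mine, not the book’s):

a_lot_smarter = 10
exponentially_smarter = 2
for step in range(10):
    a_lot_smarter += 10         # "a lot": add a constant each step
    exponentially_smarter *= 2  # exponential: multiply each step

print(a_lot_smarter)            # 110
print(exponentially_smarter)    # 2048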

Working for the Dept. of Defense is “moral compromise” and does not “prevent harm, save lives.” (320) Very simplistic. An ethicist like Lorelai can surely see how killing can save lives. This felt like a ’90s movie, or Avatar: preachy and anti-military.

“Life is not an algorithm and never will be.” (336)

Vocab

  • empathetic dysfunction – a debilitating response to an overload of impersonal but negative information (like world news)