Review: Culpability

Culpability
By Bruce Holsinger
Spiegel & Grau, July 2025, 380pp.

The Short of It:

Tech advances, taken to any extreme, cause irreparable damage.

The Rest of It:

When the Cassidy-Shaws’ autonomous minivan collides with an oncoming car, seventeen-year-old Charlie is in the driver’s seat, with his father, Noah, riding shotgun. In the back seat, tweens Alice and Izzy are on their phones, while their mother, Lorelei, a world leader in the field of artificial intelligence, is absorbed in her work. Yet each family member harbors a secret, implicating them all in the tragic accident. ~ the publisher

I’ve been looking forward to reading this one for a while. To begin, let’s look at the definition of culpability:

responsibility for a fault or wrong; blame.

Really? I kept that title in mind while I was reading and was left scratching my head. Why? Because I never felt that any of these characters felt the weight of their mistakes.

Autonomous vehicles. This one was fully occupied by a family who chose to let the vehicle drive itself. I see autonomous vehicles like Waymo out and about here in Century City, and having observed them in traffic, they are pretty responsive. Probably more so than, say, a kid on their phone.

There you have it, the conflict. Seventeen-year-old Charlie was at the wheel when a car turned into their lane. His response was hampered by the fact that he was actively texting at that critical moment. With his mom and sisters behind him, and his dad to his right, you’d think someone’s focus would be on the road, but every person in that car was otherwise occupied. It was in autonomous mode, after all. Are you to blame when in autonomous mode?

As the Cassidy-Shaws recover from their injuries, they decide to do so by renting a house by the water. A little R&R while they ride out the investigation. At first, none of them seem all that concerned, and honestly, this bugged me. Whether it was an error in judgment or a malfunction of the autonomous technology, it doesn’t change the fact that people were permanently impacted by this accident.

As a mom, Lorelei seems very distracted and not terribly bothered by any of it. To her, it’s not even a possibility that the technology played a role. She helped develop it and pushed her family to adopt one of these vehicles. But sure, she doesn’t want her kid to face jail time for a simple mistake.

Charlie’s father, Noah, a lawyer, is much more bothered by the investigation because he knows how these things play out. Culpability, accountability, and yes, money. Money is actually at the forefront of all of it.

While staying at this rental house, he notices the behemoth property across the way. Rich people. Security. Helicopters flying in and out. Business mogul Daniel Monet zips in and out with his beautiful teenaged daughter on his arm, and then, in a matter of minutes, Charlie is going out with her.

I was really surprised at the decision to go in this direction. Merging the lifestyles of the rich and the not-so-famous without fully fleshing them out was a risk, and one that didn’t work for me. I suppose the author was making a statement that money can buy lots of things, and that even poor choices can be made better by throwing some money at them. I didn’t care for this aspect of the story at all.

It turns out that Monet and his shallow offerings actually have a lot more to do with the Cassidy-Shaws than you’d think. A thin thread, at that. Again, not fully developed or explored. There is no accountability for anyone’s actions in this story.

What a missed opportunity to explore the ramifications of AI in society. I work in higher ed. AI is all the rage. I hold three Micro-Certs in AI adoption for higher ed, and there are definitely roadblocks and hazards to consider, including environmental impact. Did you know that, by some estimates, every prompt you send consumes bottles’ worth of water to cool the data centers that process it?

In education, AI can be a benefit or a curse. For research it’s handy, but data integrity can get muddy when it draws on multiple sources. AI also gets things confidently wrong, a problem known as hallucination, where the output is plausible-sounding nonsense.

Today, there is legitimate concern that AI is replacing humans. Look at the kiosks at McDonald’s. Many newer restaurants no longer offer the option of a person taking your order. It’s a dicey gray area that needs to be pursued with caution. Culpability had the chance to fully explore that realm and, honestly, chose not to.

Source: Borrowed
Disclosure: This post contains Bookshop.org affiliate links.