Review of Kazuo Ishiguro’s novel Klara and the Sun

A thought-provoking story touching on a wide range of themes in a subdued way. (This review will be riddled with spoilers.)

Josie, a teenager, is chronically ill and possibly dying. Her parents have a secret plan to cope with this: they’re going to create an android replica of her. They’ll have an Artificial Friend, Klara, observe Josie’s behavior closely so that Klara will be able to mimic Josie perfectly. Then Klara’s AI will be transferred into the replica when Josie dies. The hope is not just to dull the parents’ pain, but to truly cheat death:

The new Josie won’t be an imitation. She really will be Josie. A continuation of Josie.

The thinking is that, if humans have nothing like a soul, then a good copy of a person just is that person in every sense that matters. I think that’s actually true—but collecting sufficiently precise and comprehensive information about someone to make a faithful copy would be quite a feat. Klara is up for the challenge:

Of course, a human heart is bound to be complex. But it must be limited. …there’ll be an end to what there is to learn.

She thinks it’s possible to gain thorough knowledge of Josie by careful observation. This seems wildly implausible. For one thing, Klara won’t be able to determine the exact configuration of the neurons and synapses in Josie’s brain just by watching her; but how Josie behaves presumably depends, at least sometimes, on those minute details. You might think such details are merely random influences on a person rather than an essential part of who they are. But there are more substantive parts of ourselves we don’t often reveal in our behavior, too: internal dialogues we never vocalize, opinions we’re too kind or too afraid to say out loud, emotions we bottle up, memories we rarely mention. Many of these would be lost in the transition from Josie to the replica.

Still, with enough observational data, it should be possible to construct an AI that both thinks of itself as Josie and acts indistinguishably from Josie as far as her parents can detect; everything Josie’s parents know about her is ultimately based on observation too. Should they see such a replica as representing the full survival of their daughter? Surely not. Loving someone doesn’t just mean liking their personality and behavior; it means caring about them, including the parts of them that remain hidden from you. And those parts would die with the original Josie.

But death isn’t the only thing that can rob us of these hidden parts of ourselves. Consider someone suffering from amnesia, for example: their internal monologue has been interrupted, and what they know about their past self is only what other people can tell them. Most of us would agree amnesia is a lot better than death. This line of thinking draws me toward a weird conclusion: being replaced by a (conscious/sentient) AI model that’s been trained to impersonate you is more like suffering a brain injury than like dying. So Josie’s parents might even be acting in her best interests by creating the replica, providing her a genuine (albeit less-than-ideal) form of survival.


In the story, nobody gives much thought to the fact that becoming Josie would obliterate Klara’s own personality. Humans’ attitudes toward Artificial Friends like Klara seem inconsistent. Often Klara is treated as an object to be used and commanded; she is ultimately disposed of in a junkyard. But at other times she’s treated with tenderness and sympathy. This incongruity is reminiscent of how humans tend to treat pets, so it seems plausible that we might end up treating AIs the same way: as creatures that do have feelings of their own, and toward which we have some moral responsibilities, but whose feelings and interests matter much less than our own. This is an uncomfortable prospect, but at least it’s a little less appalling than the vision some dystopias depict, in which we’re utterly indifferent to the suffering of the beings we create. The analogy to animals suggests our empathy is likely to be extraordinarily arbitrary, though. Just as we tolerate cruelties to food animals that would outrage us if done to our pets, perhaps we will be concerned about the welfare of AIs attached to cute faces but not about the welfare of those doing behind-the-scenes work.


Josie’s illness is a side effect of genetic editing meant to increase her intelligence. Parents in the book’s universe face a difficult choice. If they have this editing performed on their child, the child may die. But if they forgo the procedure, the child will have severely restricted opportunities in life. The novel implies that a large portion of the population has been shut out of the world’s economy; many have lost their jobs to “substitutions” (it’s unclear whether this means replacement by genetically edited individuals or replacement by AI). With each passing year, this feels to me less like a distant future worry and more like a pressing real-world concern. We need a way of organizing society that leaves space for people who cannot do anything more efficiently or cheaply than a machine can. Sooner or later, that will be all of us.


Josie does not die, because the Sun-focused religion Klara invents for herself yields a miraculous healing. The innocent, earnest faith that Klara invests in the Sun for flimsy reasons is uncomfortably realistic—uncomfortable because a crushing and humiliating disappointment was just as likely an outcome as the happy ending the book chooses to present.