Writer Ted Chiang on AI and grappling with big ideas

Ted Chiang was recently awarded the PEN/Faulkner Foundation's prize for short story excellence.
Alan Berner

Science fiction author Ted Chiang wishes he could write faster.

His entire body of work from the last 34 years almost completely fits into two book-length collections of short stories, and he says he feels the pressure that many writers do — to be more prolific.

"I can't claim any moral high ground or deliberate strategy. It's mostly just that I'm just a very slow writer," Chiang said.

But each of his stories is meticulously crafted, the result of big philosophical questions that gnaw at him for months or even years. And he is no stranger to success: His novella-length "Story of Your Life" was the basis of the film Arrival. Many of his works have won science fiction's highest accolades and prizes.

Chiang recently added another prestigious award to that list. He is the recipient of this year's PEN/Malamud Award, which celebrates "excellence in the short story."

Chiang sat down with All Things Considered host Scott Detrow to talk about his writing process, the philosophical ideas that undergird science fiction and why he doesn't think AI is capable of making art.

This interview has been edited for length and clarity.

Ted Chiang's "Story of Your Life" was the basis of the film Arrival.
Penguin Random House /
Ted Chiang's "Story of Your Life" was the basis of the film Arrival.


Interview highlights

Scott Detrow: I want to start really broadly because I think so many of your stories seem to be asking big questions, whether it's how humans would behave when they encounter a disruptive new technology, or an alien race, or the physical presence of God. But then all the stories come back to the human reaction to that, as opposed to the existential problem itself. When you're coming up with these stories, do you start with the big question? Do you start with the character? Where does your mind typically drift first?

Ted Chiang: I usually start with what you would call "the big question." I am interested in philosophical questions, but I think that thought experiments are often very abstract, and it can be somewhat hard for people to engage with them. What science fiction is good at is, it offers a way to dramatize thought experiments. The way it happens for me is that ideas come and ideas go. But when an idea keeps recurring to me over a period of time, months or sometimes years, that is an indicator to me that I should pay more attention to this idea, that this idea is gnawing at me. The only way for me to really get it to stop gnawing at me is to write a story.

Detrow: You've published a series of articles in The New Yorker taking a critical look at AI and often making arguments that this is being framed the wrong way when popular culture talks about artificial intelligence [or] large language models like ChatGPT. What is it about AI in this moment that interests you?

Chiang: As a science fiction writer, I've always had a certain interest in artificial intelligence. But as someone who studied computer science in college, I've always been acutely aware of the vast chasm between science-fictional depictions of AI and the reality of AI. I think the companies who are trying to sell you AI benefit from blurring this distinction. They want you to think that they are selling a kind of science-fictional vision of your superhelpful robot butler. But the technology they have is so radically unlike what science fiction has traditionally depicted.

Detrow: In one of these essays, perhaps the one that got the most attention, you made the argument that AI is not going to make great art. Can you walk us through your thinking, your argument that ChatGPT probably isn't going to write a great novel and DALL-E isn't going to create really valuable, fundamental works of art?

Chiang: So the premise of generative AI is that you, as the user, expend very little effort, and then you get a high-quality output. You might enter a short prompt, and then you get a long piece of text, like a short story or maybe a novel. Or you enter a short prompt, and then you get a highly detailed image, like a painting. You cannot specify a lot in a short text prompt. An artist needs to have control of every aspect of a painting. A writer needs to have control over every sentence in a novel. And you simply cannot have control over every sentence in a novel if all you gave was a pretty short text prompt.

Detrow: Tying this back to your fictional work, I think a lot of your stories will propose a new innovation or a scientific discovery that just rocks the society that it comes upon. Is it fair to say that, at least when we're talking about generative AI in the current conversation, you do not see it as that kind of game-changing development?

Chiang: I think that generative AI will have massive repercussions, not because it is fundamentally a transformative tool, but because companies will be quick to adopt it as a way of cutting costs. And by the time they realize that it is not actually that effective, they may have destroyed entire industries. But in the meantime, they might have made a lot of short-term money, and it may have cost thousands or millions of people their jobs.

Detrow: There are these big societal changes in your pieces. But in a lot of the stories, the main character won't necessarily change that much of their identity. Whatever massive shift is happening seems just kind of to confirm their sense of purpose or their sense of identity. I'm wondering how you think about that, and if you think that's maybe a hopeful takeaway from some of these stories.

Chiang: So, I would say that big technological changes, they often will demand that we kind of rethink a lot of things, but they don't automatically change our fundamental values. If you loved your children before, you should continue to love your children — there's no technological advance that will make you think, "Oh, actually, loving my children, I guess I'm going to discard that idea." So, I wouldn't say that the characters are unaffected or that they just go on being the same. It's more that they hopefully find some way to live, which allows them to be faithful to their core beliefs, their core values, even in the face of a world that has changed in a very unexpected way.

Copyright 2024 NPR

Scott Detrow is a White House correspondent for NPR and co-hosts the NPR Politics Podcast.