Quotable (#194)

Ben Narasin cites an (unnamed) Chinese Communist Party official on the techno-economic function of science fiction:

For years we’ve been making wonderful things. We make your iPods. We make phones. We make them better than anybody else, but we don’t come up with any of these ideas. So we went on a tour of America talking to people at Microsoft, at Google, at Apple, and we asked them a lot of questions about themselves, just the people working there. And we discovered they all read science fiction … so we think maybe it’s a good thing.

Quotable (#170)

Sunspring (see linked video):

Ars is excited to be hosting this online debut of Sunspring, a short science fiction film that’s not entirely what it seems. It’s about three people living in a weird future, possibly on a space station, probably in a love triangle. You know it’s the future because H (played with neurotic gravity by Silicon Valley’s Thomas Middleditch) is wearing a shiny gold jacket, H2 (Elisabeth Gray) is playing with computers, and C (Humphrey Ker) announces that he has to “go to the skull” before sticking his face into a bunch of green lights. It sounds like your typical sci-fi B-movie, complete with an incoherent plot. Except Sunspring isn’t the product of Hollywood hacks — it was written entirely by an AI. To be specific, it was authored by a recurrent neural network called long short-term memory, or LSTM for short. At least, that’s what we’d call it. The AI named itself Benjamin.
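For the curious: “long short-term memory” is a standard recurrent-network architecture for sequence modeling, and a screenplay generator like Benjamin is typically trained character by character on a corpus of scripts. A minimal character-level generator in that spirit might look something like the sketch below. This is an illustrative PyTorch example with made-up names (CharLSTM, sample, stoi/itos), not Benjamin’s actual code.

```python
import torch
import torch.nn as nn

class CharLSTM(nn.Module):
    """Character-level LSTM: reads a sequence of character indices and
    predicts the next character at each step."""
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, state=None):
        # x: (batch, seq_len) integer character indices
        out, state = self.lstm(self.embed(x), state)
        return self.head(out), state


def sample(model, stoi, itos, prompt, length=200, temperature=0.8):
    """Generate `length` characters by repeatedly sampling from the model.
    `stoi`/`itos` map characters to indices and back, built from whatever
    corpus the model was trained on (e.g. a pile of screenplays)."""
    model.eval()
    idx = torch.tensor([[stoi[c] for c in prompt]])
    state, chars = None, list(prompt)
    with torch.no_grad():
        for _ in range(length):
            logits, state = model(idx, state)
            # Sample the next character from the softmax over the last step.
            probs = torch.softmax(logits[0, -1] / temperature, dim=-1)
            nxt = torch.multinomial(probs, 1)
            chars.append(itos[nxt.item()])
            idx = nxt.view(1, 1)
    return "".join(chars)
```

Trained on screenplays and seeded with a scene heading, a model like this will happily produce grammatical-looking, plot-free dialogue of roughly Sunspring quality.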

Cosmic Sociology

From the prologue to Cixin Liu’s The Dark Forest (the follow-up to The Three-Body Problem):

“See how the stars are points? The factors of chaos and randomness in the complex makeups of every civilized society in the universe get filtered out by distance, so those civilizations can act as reference points that are relatively easy to manipulate mathematically.”
“But there’s nothing concrete to study in your cosmic sociology, Dr. Ye. Surveys and experiments aren’t really possible.”
“That means your ultimate result will be purely theoretical. Like Euclid’s geometry, you’ll set up a few simple axioms at first, then derive an overall theoretic system using those axioms as a foundation.”
“It’s all fascinating, but what would the axioms of cosmic sociology be?”
“First: Survival is the primary need of civilization. Second: Civilization continuously grows and expands, but the total matter in the universe remains constant.”

“Those two axioms are solid enough from a sociological perspective … but you rattled them off so quickly, like you’d already worked them out,” Luo Ji said, a little surprised.
“I’ve been thinking about this for most of my life, but I’ve never spoken about it with anyone before. I don’t know why, really. … One more thing: To derive a basic picture of cosmic sociology from these two axioms, you need two other important concepts: chains of suspicion, and the technological explosion.”
(pp. 13, 14)

Quotable (#68)

John Sundman’s Acts of the Apostles reviewed at Salon (2001):

… it’s hard to argue with the central thesis of “Acts of the Apostles,” which is that advances in computer technology and biotechnology are proceeding so quickly that we are speedily approaching the day when scientists and programmers are able to design machines that can alter our genetic structure and reshape our brains. And what is the engine of this change? Why, capitalism, of course. In particular, Silicon Valley-style capitalism — the relentless search for products that can generate vast revenue through innovations in high technology.

In Sundman’s view, this is a progress that can’t be stopped. Ethicists can’t stop it, governments can’t stop it, and even the band of heroes in “Acts of the Apostles” is essentially powerless. They can deflect it, but not derail it. His horror at the future echoes Sun co-founder Bill Joy’s warning about technological progress. But whereas Joy argues that the dangers of technological progress call out for restraint and/or government intervention, Sundman, at least as far as his novel is concerned, seems convinced that little can be done to stop it. The capitalist imperative is too strong. Even if you stop one megalomaniac software czar, a hundred more will jump to take his or her place.