GPT-2 tries to imitate Yoon Ha Lee

In 2021, @telophase fine-tuned GPT-2 on the complete text of Yoon Ha Lee’s Machineries of Empire novels. (With Yoon’s permission.) @telophase then posted some of the (often entertaining) results.
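(For the curious: “fine-tuning” here just means continuing GPT-2’s training on the novels’ text, so the model picks up their vocabulary and style. Below is a minimal sketch of how one might do that with the Hugging Face transformers library. This is my own illustration, not @telophase’s actual setup; the file name, output directory, prompt, and hyperparameters are all placeholders.)

    # A minimal sketch of fine-tuning GPT-2 on a text file with Hugging Face
    # transformers. "corpus.txt", "gpt2-yhl", and the hyperparameters are
    # placeholders, not @telophase's actual configuration.
    from transformers import (
        GPT2LMHeadModel,
        GPT2TokenizerFast,
        TextDataset,
        DataCollatorForLanguageModeling,
        Trainer,
        TrainingArguments,
    )

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Chop the corpus into fixed-length blocks of tokens for language modeling.
    train_dataset = TextDataset(
        tokenizer=tokenizer, file_path="corpus.txt", block_size=128
    )
    collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="gpt2-yhl", num_train_epochs=3),
        data_collator=collator,
        train_dataset=train_dataset,
    )
    trainer.train()

    # Sample from the fine-tuned model, starting from a short prompt.
    prompt_ids = tokenizer.encode("Inside, there was", return_tensors="pt")
    output_ids = model.generate(prompt_ids, max_length=60, do_sample=True, top_p=0.9)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))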

I think this one is my favorite:

Inside, there was a very fine collection of paper and styluses, a very large map, and a very good whiskey. The only thing that stopped him from passing out was the perfume-soaked floor and the dripping floor and the rippling floor. He smelled of odd juniper, and the stinging mint. And then there were the servitors. And then there were the servitors. And then there were the servitors.

Clarifying note: As I understand it, all of the bits in the post were generated by GPT-2. Some of them (in whole or in part) also happen to be direct quotes from the original text, because GPT-2’s output sometimes includes verbatim phrases from its training data. (For example, the phrase “good whiskey” appears in the books, but the phrase “and a very good whiskey” does not.)

I mention that because @telophase framed the post as a quiz, asking which bits were generated and which were verbatim quotes; I initially misinterpreted that as meaning that she had copied some lines from the original text and manually included them in the list. But that turns out not to be what was going on.

Also, she doesn’t give answers to the quiz. And there are only about two of these bits that I can imagine being entirely verbatim quotes from the original anyway. So I recommend just ignoring the quiz framing; for me, it detracted from the fun of what GPT-2 generated.
