GPT-3 demos: one month in
OpenAI is expanding access to its API powered by GPT-3, the lab’s latest gargantuan language model. As I wrote in last month’s DT #42, what makes GPT-3 special is that it can perform a wide variety of language tasks straight out of the box, making it much more accessible than its predecessor, GPT-2:
For example, if you feed it several questions and answers prefixed with “Q:” and “A:” respectively, followed by a new question and “A:”, it’ll continue the passage by answering the question—without ever having to update its weights! Other examples include parsing unstructured text data into tables, improving English-language text, and even turning natural language into Bash terminal commands (but can it do git?).
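To make that concrete, here’s a minimal sketch of that Q&A pattern using the openai Python package that API users get; the example questions and the parameter values are my own picks, not anything from OpenAI’s docs:

```python
import os
import openai  # OpenAI's Python client for the GPT-3 API

openai.api_key = os.environ["OPENAI_API_KEY"]

# A few example Q/A pairs show the model the pattern; the final "A:" cues
# it to answer the new question by simply continuing the text.
prompt = (
    "Q: What is the capital of France?\n"
    "A: Paris.\n"
    "Q: Who wrote the novel 1984?\n"
    "A: George Orwell.\n"
    "Q: What is the tallest mountain on Earth?\n"
    "A:"
)

completion = openai.Completion.create(
    engine="davinci",   # the largest GPT-3 model behind the API
    prompt=prompt,
    max_tokens=20,
    temperature=0.0,    # low randomness suits factual Q&A
    stop="\n",          # stop at the end of the answer line
)
print(completion.choices[0].text.strip())  # e.g. "Mount Everest."
```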
At the time, only a few companies (like Casetext, MessageBird and Quizlet) and researchers (like Janelle Shane) had access to the API. But the rest of us could sign up for a waitlist, and over the past few weeks OpenAI has started sending out invites. I’ve collected some of the coolest demos here, roughly grouped by topic. I know it’s a lot of links, but many of these are definitely worth a look! They’re all very impressive, very funny, or both.
A big group of projects generates some form of code.
- Sharif Shameem prompted the API with a description of a website layout, and his app then generated and rendered the layout as JSX code. A little while later, he got it to generate functioning React apps.
- Harley Turan also used GPT-3 to generate React components based on variable names. Sonny Lazuardi did something similar, integrating React component generation into Figma.
- Components AI showed GPT-3 a few examples of words and emojis with corresponding hexadecimal color scales. It could then generate new color scales based on emoji. My favorites are “smoke” and “🦋”.
Two other projects imitate famous writers.
- AI|Writer by Andrew Mayne “creates simulated hypothetical correspondence with famous personalities, both real and fictitious.” You send a “Dear…” letter for a famous figure to an email address the app provides, and a little while later it emails back with “their” response. My favorite example so far is a conversation between the Hulk and Bruce Banner.
- Nick Cammarata had GPT-3 generate poetry: Richard Feynman in the style of Robert Frost.
Another set of projects restructures text into new forms.
- Another experiment by Andrew Mayne transforms a movie script into a story (and the reverse). I found this demo particularly impressive: the story also includes a lot of relevant and interesting details that were not in the original script.
- Francis Jervis had GPT-3 turn plain language into legal language. For example, “My apartment had mold and it made me sick” became “Plaintiff’s dwelling was infested with toxic and allergenic mold spores, and Plaintiff was rendered physically incapable of pursuing his or her usual and customary vocation, occupation, and/or recreation.” (More here.)
- Mckay Wrigley built a site called Learn From Anyone, where you can ask Elon Musk to teach you about rockets, or Shakespeare to teach you about writing.
Some projects are about music.
- Arram Sabeti used GPT-3 for a bunch of different things, including generating songs: he had both Lil Wayne and Taylor Swift write songs called “Harry Potter,” with great results. (The blog post also contains a fake user manual for a flux capacitor and a fake essay about startups on Mars by Paul Graham.)
- Sushant Kumar got the API to write vague but profound-sounding snippets about music. For example, “Innovation in rock and roll was often a matter of taking a pop melody and playing it loudly.” And, “You can test your product by comparing it to a shitty product it fixes. With music, you can’t always do that.” (It also generates tweets for blockchain, art, or any other word.)
And finally, some projects did more of the fun prompt-and-response text generation we saw from GPT-2 earlier:
- Mario Klingemann, creator of the Memories of Passersby I AI art project (see DT #9), has been playing with GPT-3 a lot. He asked it why the chicken crossed the road, had it fill in the blanks in stories, and asked it to make boring sentences sound more interesting.
- Sid Bharath asked GPT-3 what the purpose of life is. “Life is a beautiful miracle. Life evolves through time not greater forms of beauty. In that sense, the purpose of life is to increase the beauty of universe.” (The conversation goes on from there.)
- Janelle Shane had GPT-3 generate fake facts about whales, in the form of bullet points and Wikipedia auto-completes.
- Kevin Lacker gave GPT-3 a Turing test.
GPT-3 generating episode titles and summaries for the Connected podcast.
I also got my own invite to try GPT-3 for This Episode Does Not Exist!, my project to generate fake episode titles and summaries for my favorite podcasts, like Connected and Hello Internet. It used to work by fine-tuning GPT-2 on metadata of all previous episodes of the show for 600 to 1,000 epochs, a process that took about half an hour on a P100 GPU on Colab. Now, with GPT-3, I can simply paste 30ish example episodes into the playground (more would exceed the input character limit), type “Title:”, and GPT-3 generates a few new episodes—no retraining required! Once I get a chance to wrap this into a Python script, it’ll become so much easier for me to add new podcasts and episodes to the website.
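That script could be a thin wrapper around the same few-shot trick. Here’s a rough sketch of what I have in mind, where the episode data, field names, and sampling parameters are all placeholders I’d still need to tune:

```python
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

def build_prompt(episodes, limit=30):
    """Format recent episodes as Title:/Summary: pairs, then cue a new title."""
    examples = [
        f"Title: {ep['title']}\nSummary: {ep['summary']}"
        for ep in episodes[-limit:]  # ~30 episodes fits within the input limit
    ]
    return "\n\n".join(examples) + "\n\nTitle:"

def generate_episode(episodes):
    completion = openai.Completion.create(
        engine="davinci",
        prompt=build_prompt(episodes),
        max_tokens=150,
        temperature=0.8,  # some randomness keeps the fake episodes fun
        stop="\n\n",      # stop after one title + summary pair
    )
    return "Title:" + completion.choices[0].text

# Hypothetical usage: in practice, episodes would come from a podcast's RSS feed.
episodes = [
    {"title": "Example Episode", "summary": "An example summary."},
]
print(generate_episode(episodes))
```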