OpenAI is expanding access to its API powered by GPT-3,
the lab’s latest gargantuan language model. As I wrote in last month’s DT #42, what makes GPT-3 special is that it can perform a wide variety of language tasks straight out of the box, making it much more accessible than its predecessor, GPT-2:
For example, if you feed it several questions and answers prefixed with “Q:” and “A:” respectively, followed by a new question and “A:”, it’ll continue the passage by answering the question—without ever having to update its weights! Other examples include parsing unstructured text data into tables, improving English-language text, and even turning natural language into Bash terminal commands (but can it do git?).
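As a minimal sketch of that few-shot “Q:”/“A:” prompt format (the questions and answers below are made up for illustration, not from OpenAI’s examples):

```python
# Build a few-shot prompt in the "Q:"/"A:" format described above.
# The model would be asked to continue this string after the final "A:".
examples = [
    ("What is the capital of France?", "Paris"),
    ("What is 2 + 2?", "4"),
]
new_question = "Who wrote Hamlet?"

prompt = "\n".join(f"Q: {q}\nA: {a}" for q, a in examples)
prompt += f"\nQ: {new_question}\nA:"

print(prompt)
```

The whole trick is that the prompt ends mid-pattern, so the most likely continuation is an answer to the new question—no fine-tuning or weight updates involved.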
At the time, only a few companies (like Casetext, MessageBird and Quizlet) and researchers (like Janelle Shane) had access to the API. But the rest of us could sign up for a waitlist, and over the past few weeks OpenAI has started sending out invites. I’ve collected some of the coolest demos here, roughly grouped by topic. I know it’s a lot of links, but many of these are definitely worth a look! They’re all very impressive, very funny, or both.
A big group of projects generates some form of code.
Two other projects imitate famous writers.
Another set of projects restructures text into new forms.
- Another experiment by Andrew Mayne can transform a movie script into a story (and the reverse). I found this demo particularly impressive: the story also includes a lot of relevant and interesting details that were not in the original script.
- Francis Jervis had GPT-3 turn plain language into legal language. For example, “My apartment had mold and it made me sick” became “Plaintiff’s dwelling was infested with toxic and allergenic mold spores, and Plaintiff was rendered physically incapable of pursing his or her usual and customary vocation, occupation, and/or recreation.” (More here.)
- Mckay Wrigley built a site called Learn From Anyone, where you can ask Elon Musk to teach you about rockets, or Shakespeare to teach you about writing.
Some projects are about music.
- Arram Sabeti used GPT-3 for a bunch of different things, including generating songs: he had both Lil Wayne and Taylor Swift write songs called “Harry Potter,” with great results. (The blog post also contains a fake user manual for a flux capacitor and a fake essay about startups on Mars by Paul Graham.)
- Sushant Kumar got the API to write vague but profound-sounding snippets about music. For example, “Innovation in rock and roll was often a matter of taking a pop melody and playing it loudly.” And, “You can test your product by comparing it to a shitty product it fixes. With music, you can’t always do that.” (It also generates tweets for blockchain, art, or any other word.)
And finally, some projects did more of the fun prompt-and-response text generation we saw from GPT-2 earlier: