#44: One month in, GPT-3-powered OpenAI API demos take the web by storm

Hey everyone, welcome to Dynamically Typed #44. Today I’ll be covering just one productized AI story—about the crazy demos that people have been building using OpenAI’s GPT-3 API—for two reasons. First, a lot has already been built with it, so there’s a ton to cover. Second, I’m on holiday, so I have a lot of very important swimming and biking around to do this weekend. Let’s jump right in.

Productized Artificial Intelligence 🔌
OpenAI is expanding access to its API powered by GPT-3, the lab’s latest gargantuan language model. As I wrote in last month’s DT #42, what makes GPT-3 special is that it can perform a wide variety of language tasks straight out of the box, making it much more accessible than its predecessor, GPT-2:
For example, if you feed it several questions and answers prefixed with “Q:” and “A:” respectively, followed by a new question and “A:”, it’ll continue the passage by answering the question—without ever having to update its weights! Other examples include parsing unstructured text data into tables, improving English-language text, and even turning natural language into Bash terminal commands (but can it do git?).
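To make that prompt format concrete, here’s a minimal sketch of what such a few-shot prompt looks like (the example questions are my own, made up for illustration):

```python
# A few-shot Q&A prompt for GPT-3: a handful of example pairs, then a new
# question. The model simply continues the text after the final "A:".
prompt = """Q: What is the capital of France?
A: Paris.

Q: How many legs does a spider have?
A: Eight.

Q: In what year did the first moon landing happen?
A:"""

# Sent to the API, this prompt should come back with a completion like " 1969."
```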
At the time, only a few companies (like Casetext, MessageBird and Quizlet) and researchers (like Janelle Shane) had access to the API. But the rest of us could sign up for a waitlist, and over the past few weeks OpenAI has started sending out invites. I’ve collected some of the coolest demos here, roughly grouped by topic. I know it’s a lot of links, but many of these are definitely worth a look! They’re all very impressive, very funny, or both.
A big group of projects generates some form of code.
Two other projects imitate famous writers.
Another set of projects restructures text into new forms.
  • Another experiment by Andrew Mayne can transform a movie script into a story (and the reverse). I found this demo particularly impressive: the story also includes a lot of relevant and interesting details that were not in the original script.
  • Francis Jervis had GPT-3 turn plain language into legal language. For example, “My apartment had mold and it made me sick” became “Plaintiff’s dwelling was infested with toxic and allergenic mold spores, and Plaintiff was rendered physically incapable of pursuing his or her usual and customary vocation, occupation, and/or recreation.” (More here.)
  • Mckay Wrigley built a site called Learn From Anyone, where you can ask Elon Musk to teach you about rockets, or Shakespeare to teach you about writing.
Some projects are about music.
  • Arram Sabeti used GPT-3 for a bunch of different things, including generating songs: he had both Lil Wayne and Taylor Swift write songs called “Harry Potter,” with great results. (The blog post also contains a fake user manual for a flux capacitor and a fake essay about startups on Mars by Paul Graham.)
  • Sushant Kumar got the API to write vague but profound-sounding snippets about music. For example, “Innovation in rock and roll was often a matter of taking a pop melody and playing it loudly.” And, “You can test your product by comparing it to a shitty product it fixes. With music, you can’t always do that.” (It also generates tweets for blockchain, art, or any other word.)
And finally, some projects did more of the fun prompt-and-response text generation we saw from GPT-2 earlier:
GPT-3 generating episode titles and summaries for the Connected podcast.
I also got my own invite to try GPT-3 for This Episode Does Not Exist!, my project to generate fake episode titles and summaries for my favorite podcasts, like Connected and Hello Internet. It used to work by fine-tuning GPT-2 on metadata of all previous episodes of the show for 600 to 1,000 epochs, a process that took about half an hour on a P100 GPU on Colab. Now, with GPT-3, I can simply paste 30-ish example episodes into the playground (more would exceed the input character limit), type “Title:”, and GPT-3 generates a few new episodes—no retraining required! Once I get a chance to wrap this into a Python script, it’ll become so much easier for me to add new podcasts and episodes to the website.
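For the curious, that script could be a thin wrapper around OpenAI’s Python client as it worked during the beta. The prompt format mirrors the playground setup above, but the `build_prompt` helper and the placeholder episodes are my own illustrative assumptions, not the actual code behind This Episode Does Not Exist!:

```python
import openai

openai.api_key = "YOUR_API_KEY"  # from the OpenAI beta invite

# Hypothetical sketch: build a few-shot prompt from ~30 example episodes,
# then let GPT-3 continue the pattern with a new title and summary.
def build_prompt(episodes):
    blocks = [f"Title: {title}\nSummary: {summary}" for title, summary in episodes]
    return "\n\n".join(blocks) + "\n\nTitle:"

examples = [
    ("Example Episode One", "A short summary of the first episode."),
    ("Example Episode Two", "A short summary of the second episode."),
    # ... ~30 real episodes would go here, up to the input character limit
]

response = openai.Completion.create(
    engine="davinci",           # the original GPT-3 base engine
    prompt=build_prompt(examples),
    max_tokens=100,
    temperature=0.8,
    stop=["\n\n"],              # stop after one generated episode block
)
print("Title:" + response["choices"][0]["text"])
```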
That’s it for today. These demos definitely gave me something to chew on: the opportunities, but also the risks, that’ll come with wide availability of OpenAI’s GPT-3 API are enormous. If any of you have thoughts about this stuff, do reach out and let me know.
Thanks for reading! As usual, you can let me know what you thought of today’s issue using the buttons below or by replying to this email. If you’re new here, check out the Dynamically Typed archives or subscribe below to get a new issue in your inbox every second Sunday.
If you enjoyed this issue of Dynamically Typed, why not forward it to a friend? It’s by far the best thing you can do to help me grow this newsletter. 🏝
Leon Overweel (Dynamically Typed)

My thoughts and links on productized artificial intelligence, machine learning technology, and AI projects for the climate crisis. Delivered to your inbox every second Sunday.
