First thoughts with GPT-3

14 August 2020
2 min read

I recently received access to one of the hottest betas in tech: GPT-3. I’ve had a chance to play around with it and wanted to note down some first impressions.

  • The onboarding is well designed. There’s a playground option that automatically completes any text you enter, making it great for non-techies.
  • Additionally, I’d recommend watching the onboarding videos on how to design prompts. You can make magic happen, but it sometimes takes a few tries to get the prompt right (there’s a minimal example after this list).
  • The staff has been great at answering questions.
  • As has been mentioned before, it’s essential to think about the second-order effects of anyone being able to generate text of any type. Yes, the output isn’t revolutionary, but how much text needs to be? Most SEO articles, for example, just need to hit a word count.
  • The model (in my experience) struggles somewhat with longer-form text. There’s been talk that this is due to the model’s limited memory (its context window), so I wouldn’t plan on using it for pages of text.
  • Additionally, the model is much better at nonfiction than fiction. I think this is because there’s less fiction text out there. This is where I see the best opportunity to fine-tune the model.
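
To give a concrete sense of what prompt design looks like, here’s a minimal sketch using the Python client OpenAI shipped with the beta. The few-shot prompt, engine choice, and parameter values are my own illustrative picks, not anything official:

```python
import openai

openai.api_key = "YOUR_API_KEY"  # issued with beta access

# A few-shot prompt: show the model the pattern you want it to continue.
prompt = (
    "Convert each sentence into a friendly tweet.\n\n"
    "Sentence: Our store opens at 9am on Saturdays.\n"
    "Tweet: We're open bright and early at 9am this Saturday. Come say hi!\n\n"
    "Sentence: The new blog post about GPT-3 is live.\n"
    "Tweet:"
)

response = openai.Completion.create(
    engine="davinci",   # the largest engine available in the beta
    prompt=prompt,
    max_tokens=40,      # keep the completion short
    temperature=0.7,    # some creativity, but not too wild
    stop="\n",          # cut off at the end of the generated tweet
)

print(response.choices[0].text.strip())
```

Most of the magic is in the prompt itself: a clear instruction plus an example of the pattern usually beats a bare request.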

I plan on posting a development write-up using GPT-3 at some point, but wanted to get these first thoughts down. I believe this model is the start of something massive. I’ve had pretty great results with minimal cherry-picking and already can’t wait for GPT-4.
