GPT Update

30 April 2022
1 min read

Circling back to my initial thoughts on GPT-3: I’ve come across the great work EleutherAI has been doing, and I wanted to note it here for posterity.

EleutherAI describes itself as a “grassroots collective of researchers working to open-source AI research.” It currently offers two main models: GPT-J, a 6-billion-parameter model, and GPT-NeoX-20B, a 20-billion-parameter model. For reference, GPT-3 has roughly 175 billion parameters. EleutherAI is working toward releasing a model the size of GPT-3.

Across accuracy benchmarks, size correlates with performance: more parameters generally mean higher accuracy. However, GPT-J and GPT-NeoX-20B can be fine-tuned on sample data, and that tuning can bring their performance on a specific task close to that of GPT-3’s most accurate model.
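As a sketch of what "sample data" for fine-tuning might look like, here is a minimal example of preparing prompt/completion pairs in JSONL, one record per line. The sentiment-classification examples and the field names are assumptions following common fine-tuning conventions, not a format specified by EleutherAI.

```python
import json

# Hypothetical task examples; the prompt/completion layout follows a common
# fine-tuning convention and is an assumption, not an EleutherAI spec.
samples = [
    {"prompt": "Classify sentiment: 'I loved this movie.' ->", "completion": " positive"},
    {"prompt": "Classify sentiment: 'The plot dragged on.' ->", "completion": " negative"},
]

def to_jsonl(records):
    """Serialize records as JSONL: one JSON object per line."""
    return "\n".join(json.dumps(r) for r in records)

print(to_jsonl(samples))
```

Even a few hundred records like these can be enough to specialize a general model for a narrow task.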

One drawback of building on GPT-3 is cost. In my experience, you’d want to use Davinci (OpenAI’s most expensive model) to tune your input data, but a few queries can quickly exceed a side-project budget.
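To make the budget point concrete, here is a rough cost estimator. The $0.06 per 1,000 tokens figure is my assumption based on Davinci’s published pricing around this time; check current pricing before relying on it.

```python
# Assumed Davinci pricing circa 2022; verify against OpenAI's current rates.
DAVINCI_PRICE_PER_1K_TOKENS = 0.06

def estimate_cost(tokens_per_query: int, num_queries: int) -> float:
    """Estimated dollar cost for a batch of Davinci queries (prompt + completion tokens)."""
    return tokens_per_query * num_queries * DAVINCI_PRICE_PER_1K_TOKENS / 1000

# 500 queries at ~2,000 tokens each already costs about $60.
print(f"${estimate_cost(2000, 500):.2f}")
```

At that rate, iterating on prompts during development adds up fast, which is where a locally hosted GPT-J becomes attractive.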

Overall, having an open source competitor for GPT-3 is terrific for the consumer.


