Circling back from my initial thoughts on GPT-3, I’ve come across the great work EleutherAI has been doing, and I wanted to note it here for posterity.
EleutherAI defines itself as a “grassroots collective of researchers working to open-source AI research.”
They currently have two main models available: GPT-J, a 6-billion-parameter model, and GPT-NeoX-20B, a 20-billion-parameter model. For reference, GPT-3 has ~175 billion parameters. EleutherAI is currently working toward releasing a model the size of GPT-3.
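For anyone curious, here’s roughly what running GPT-J locally looks like with Hugging Face’s transformers library. This is a minimal sketch, assuming a CUDA GPU with enough memory to hold the 6B weights in fp16 (around 16 GB):

```python
# Minimal sketch: text generation with GPT-J 6B via Hugging Face
# transformers. Assumes a CUDA GPU with ~16 GB of memory for fp16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6B",
    torch_dtype=torch.float16,  # halves memory use; needs a GPU
).to("cuda")

prompt = "EleutherAI is"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(inputs.input_ids, max_new_tokens=50, do_sample=True)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

GPT-NeoX-20B works the same way via the EleutherAI/gpt-neox-20b checkpoint, though it needs considerably more memory.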
On every accuracy benchmark, size correlates with higher scores, i.e. more parameters generally mean higher accuracy. However, GPT-J and GPT-NeoX-20B can be fine-tuned on sample data, and that tuning can boost task-specific performance to rival GPT-3’s most accurate model.
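To make the fine-tuning point concrete, here’s a minimal sketch using the Hugging Face Trainer. I’m using the small gpt-neo-125M checkpoint so it stays runnable on a single GPU, and `train.txt` is a hypothetical file of sample data; fine-tuning the full 6B or 20B models takes the same shape but needs far more hardware (or parameter-efficient methods):

```python
# Minimal sketch: fine-tuning a causal LM on a text file with the
# Hugging Face Trainer. "train.txt" is a hypothetical sample-data file.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "EleutherAI/gpt-neo-125M"  # small stand-in for GPT-J/NeoX
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT models have no pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

dataset = load_dataset("text", data_files={"train": "train.txt"})
tokenized = dataset["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt-neo-finetuned", num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```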
One drawback of building on GPT-3 is the cost. In my experience, you’d want to use Davinci (their most capable, and most expensive, model) to get good results from your prompts, but even a handful of queries can quickly exceed a side-project budget.
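For comparison, here’s what a Davinci query looks like through the openai Python package (a minimal sketch using the Completion endpoint as it existed at the time; the API key is a placeholder). Every prompt and completion token is billed, which is why costs climb so fast:

```python
# Minimal sketch: a single Davinci completion via the openai package
# (pre-1.0 Completion API). Both prompt and output tokens are billed.
import openai

openai.api_key = "sk-..."  # placeholder; use your own key

response = openai.Completion.create(
    engine="davinci",
    prompt="Summarize the benefits of open-source language models:",
    max_tokens=100,
)
print(response.choices[0].text)
```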
Overall, having an open-source competitor to GPT-3 is terrific for the consumer.