AI: The Future of Fine-Tuning is Debated (3.26.24)

OpenPipe, Fine-tuning, Stability AI

Friends, today's edition is lighter on content, which feels unusual given the breakneck pace things have been moving at. One story received significant attention, but because we could not verify its authenticity, we've left it out of today's edition. In the actual news today, we see mixed signals on the future of fine-tuning and a new model from Stability AI that can run on a MacBook Air! Happy reading!

Marshall Kirkpatrick, Editor

First impacted: AI developers, Open-source model trainers
Time to impact: Short

Kyle Corbitt, CEO of OpenPipe, a platform for developers to fine-tune LLMs, announced that the company has secured $6.7M in seed funding. The company also states that users of its platform who train open-source models retain ownership of the weights and have "saved over $7M while lowering latency and improving quality by switching from GPT-4 to fine-tuned models". The next story, though, might have something interesting to say about this space! [We Raised $6.7M to Replace GPT-4 with Your Own Fine-Tuned Models - OpenPipe] Explore more of our coverage of: OpenPipe, Seed Funding, Open-Source Models. Share this story by email
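For readers newer to this space, the pattern OpenPipe is built around looks roughly like this: collect the prompt/completion pairs you are currently sending to GPT-4, then use them to fine-tune a small open-source model you own. The sketch below is a minimal illustration of that general pattern using Hugging Face transformers and peft (LoRA); it is not OpenPipe's own pipeline, and the base model, data file, and hyperparameters are placeholder assumptions.

```python
# Illustrative sketch of the "replace GPT-4 with a fine-tuned open model" pattern.
# Not OpenPipe's pipeline: base model, data file, and hyperparameters are assumptions.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

base_model = "mistralai/Mistral-7B-v0.1"  # any small open-source base model
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base_model)

# LoRA keeps the fine-tune cheap: only small adapter matrices are trained.
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

# Training data: prompt/completion pairs previously served by GPT-4 (hypothetical file).
data = load_dataset("json", data_files="gpt4_logs.jsonl")["train"]
data = data.map(
    lambda ex: tokenizer(ex["prompt"] + ex["completion"], truncation=True, max_length=1024),
    remove_columns=data.column_names,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ft-model", num_train_epochs=1,
                           per_device_train_batch_size=4),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The appeal of this approach is that the resulting weights belong to you and inference can run on much cheaper hardware than GPT-4 calls, which is the cost and latency argument OpenPipe is making.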

First impacted: AI experts, AI development teams
Time to impact: Medium

AI consultant Hamel Husain has collated and shared industry commentary on the role of fine-tuning, with some commentators arguing that it offers only marginal improvements over larger generalist models. The posts he shared, drawn from a number of industry figures, suggest that most teams will opt for generalist models because they are easier and faster to monetize. [via @HamelHusain] Explore more of our coverage of: AI Fine-Tuning, Model Optimization, AI Expert Opinions. Share this story by email

First impacted: Software Engineers, Programmers
Time to impact: Short

Stability AI has launched Stable Code Instruct 3B, a new LLM tailored for code generation and other software engineering tasks. The company stated that its performance rivals models of similar or larger size, including Code Llama 7B Instruct and DeepSeek-Coder Instruct 1.3B. Emad Mostaque, the former CEO who recently announced his resignation to pursue decentralized AI, also commented that the new model can even run on a MacBook Air! [Stable Code Instruct 3b - a Hugging Face Space by stabilityai] Explore more of our coverage of: Stability AI, Code Generation, Software Engineering. Share this story by email
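For developers who want to try the model locally, here is a minimal sketch of loading it with Hugging Face transformers. The model ID matches the Hugging Face listing linked above, but the chat-template usage, generation settings, and memory notes are our assumptions rather than Stability AI's instructions, and a recent transformers release with StableLM support is assumed.

```python
# Minimal sketch (not from the article) of running Stable Code Instruct 3B locally.
# Assumes a recent transformers version and enough RAM (~6-8 GB for bf16 weights).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stable-code-instruct-3b"  # Hugging Face model ID (assumed)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt",
                                       add_generation_prompt=True)

# Sampling settings are illustrative, not Stability AI's recommendations.
outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.5)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```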