AI: 5 summaries of big AI news today!

New 30B leader, @ your GPTs, 100X data transformations

Friends, it was a pretty wild day in AI, if your definition of wild includes new benchmark leaders, molecule editing, the ability to use an ensemble of custom GPTs in every ChatGPT session, and 100X faster data transformations (like LLM-powered text clustering and labeling). Mine does!

Here's today's AI news,

Marshall Kirkpatrick, Editor

First impacted: AI researchers, Software developers
Time to impact: Short

Abacus AI has released SMAUG, a 30B-class open-source LLM. The model is available on Hugging Face and scores 76.66 on the MMLU (Massive Multitask Language Understanding) leaderboard for models under 35B parameters, which puts it at the top for that model size and category. Abacus CEO Bindu Reddy said on X, "Our next goal is to be on top of the leaderboard for ALL open-source models, not just the ones in the 30B class! GPT-4 is in sight!" [abacusai/Smaug-34B-v0.1] Explore more of our coverage of: Abacus AI, SMAUG Language Model, Hugging Face Platform. Share this story by email
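If you want to poke at Smaug yourself, here's a rough sketch of loading it with Hugging Face's transformers library. This is just the standard transformers loading pattern, not anything Abacus-specific, and note that a 34B model wants a lot of GPU memory, so device_map="auto" (which needs the accelerate package) shards the weights across whatever GPUs you have.

# Rough sketch: loading Smaug from Hugging Face with transformers.
# Assumes plenty of GPU memory; device_map="auto" (requires the accelerate
# package) spreads the layers across available GPUs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "abacusai/Smaug-34B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # shard across available GPUs
)

prompt = "In one sentence, what is the MMLU benchmark?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=80)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))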

First impacted: Scientific researchers, Pharmaceutical researchers
Time to impact: Medium

Scientists have introduced MoleculeSTM, a model that incorporates textual knowledge into molecular editing, using known text and chemical structure data to edit and retrieve molecules. Prof. Anima Anandkumar says, "the core idea of MoleculeSTM is to align the chemical structure and textual description modalities using contrastive pretraining. The pivotal advantage of such alignment is its capacity to introduce a new paradigm of LLM for drug discovery." [Via Anima Anandkumar on LinkedIn] Explore more of our coverage of: MoleculeSTM, Hugging Face platform, Chemical Structure Data. Share this story by email
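To make "align the chemical structure and textual description modalities using contrastive pretraining" concrete, here's a toy sketch of that idea, not MoleculeSTM's actual code: two encoders map molecules and their text descriptions into a shared embedding space, and an InfoNCE-style loss pulls matched pairs together while pushing mismatched pairs apart. The encoders themselves are stand-ins here (random tensors in place of their outputs).

# Toy sketch of contrastive structure-text alignment (not MoleculeSTM's code).
# In the real model, mol_emb would come from a molecule encoder and text_emb
# from a language-model encoder; random tensors stand in for both here.
import torch
import torch.nn.functional as F

def contrastive_loss(mol_emb, text_emb, temperature=0.07):
    """InfoNCE-style loss over a batch of matched (molecule, text) pairs."""
    mol_emb = F.normalize(mol_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)
    logits = mol_emb @ text_emb.T / temperature   # pairwise similarity matrix
    targets = torch.arange(len(mol_emb))          # i-th molecule matches i-th text
    # Symmetric: molecule-to-text retrieval plus text-to-molecule retrieval.
    return (F.cross_entropy(logits, targets) + F.cross_entropy(logits.T, targets)) / 2

loss = contrastive_loss(torch.randn(8, 256), torch.randn(8, 256))
print(float(loss))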

First impacted: ChatGPT users, OpenAI non-plus users
Time to impact: Short

OpenAI has released a new feature that lets users bring custom GPTs into ChatGPT conversations by typing the @ symbol to mention them. Mentioning a GPT pulls it into an existing conversation, unlocking its domain-specific knowledge or custom actions. Cool detail: you can bring more than one GPT into a conversation! If you’re in need of good GPTs to invoke, I recommend the Consensus research summarizer and my AG Lafley’s 5 Questions Every Good Strategy Should Answer. Try them out, then you can invoke them in subsequent chats. [via @OpenAI] Explore more of our coverage of: OpenAI Updates, GPT Integration, ChatGPT Dialogues. Share this story by email

First impacted: Data Scientists, AI Developers
Time to impact: Short

Lilac AI has launched Lilac Garden, a cloud service that, according to the company, speeds up AI dataset transformations by up to one hundred times compared to running them locally. The service's first offering is LLM-powered clustering: Lilac AI says it delivers a comprehensive overview of a dataset with a single API call, producing understandable cluster names and categories from a list of documents, with the speedup coming from parallel, GPU-accelerated infrastructure. [100x Faster Clustering with Lilac Garden] Explore more of our coverage of: Lilac AI, Cloud Services, AI Dataset Transformation. Share this story by email
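Lilac Garden's actual API will look different, but the general recipe behind LLM-powered clustering and labeling is roughly: embed the documents, cluster the embeddings, then ask an LLM to name each cluster. Here's an illustrative sketch of that recipe; the embedding model, the KMeans choice, and the label_cluster stub are my assumptions, not Lilac's implementation.

# Illustrative sketch of LLM-powered text clustering and labeling.
# Not Lilac Garden's API: model names and the label_cluster stub are assumptions.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

def label_cluster(samples):
    # Placeholder: in practice you'd send these sample documents to an LLM
    # and ask it for a short, human-readable cluster title.
    return "cluster: " + samples[0][:40]

def cluster_and_label(docs, n_clusters=5):
    embeddings = SentenceTransformer("all-MiniLM-L6-v2").encode(docs)              # 1. embed
    labels = KMeans(n_clusters=n_clusters, n_init="auto").fit_predict(embeddings)  # 2. cluster
    named = {}
    for cluster_id in range(n_clusters):                                           # 3. name each cluster
        members = [doc for doc, lab in zip(docs, labels) if lab == cluster_id]
        named[label_cluster(members[:10])] = members
    return named

The embedding and LLM-labeling steps are exactly the parts that crawl on a laptop once the dataset gets big, which is where Lilac's parallel, GPU-accelerated service comes in.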

First impacted: Python programmers, CUDA programmers, AI developers
Time to impact: Short

Jeremy Howard of Answer.ai has released a video tutorial on CUDA programming for Python programmers. The guide, designed to be followed along in a Colab notebook, works through practical examples such as converting an RGB image to grayscale and multiplying matrices, and shows how to set up a CUDA environment on various systems. [Getting Started With CUDA for Python Programmers] Explore more of our coverage of: CUDA Programming, Python Tutorials, AI Development. Share this story by email
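For a flavor of the grayscale exercise, here's a sketch of the same idea using Numba's CUDA support. This is not the tutorial's code (Jeremy builds his kernels up differently), just one reasonable way to try the exercise from Python; the 0.299/0.587/0.114 weights are the standard luminance coefficients, and the rest is my own framing.

# Sketch of the RGB-to-grayscale exercise via numba.cuda (not the tutorial's code).
import numpy as np
from numba import cuda

@cuda.jit
def rgb_to_gray(rgb, gray):
    # One thread per pixel; cuda.grid(2) returns this thread's (row, col).
    row, col = cuda.grid(2)
    if row < gray.shape[0] and col < gray.shape[1]:
        r, g, b = rgb[row, col, 0], rgb[row, col, 1], rgb[row, col, 2]
        gray[row, col] = 0.299 * r + 0.587 * g + 0.114 * b  # standard luminance weights

h, w = 512, 768
rgb = np.random.rand(h, w, 3).astype(np.float32)
gray = np.zeros((h, w), dtype=np.float32)

threads_per_block = (16, 16)                       # 256 threads per block
blocks = ((h + 15) // 16, (w + 15) // 16)          # enough blocks to cover the image
rgb_to_gray[blocks, threads_per_block](rgb, gray)  # Numba copies the arrays to and from the GPU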

That’s it! More AI news tomorrow!