Stability AI Launches StableLM: Open Source ChatGPT Alternatives

Stability AI, the creator of the renowned image-generation software Stable Diffusion, has unveiled a collection of open-source language-model tools, contributing to the expansion of the large language model (LLM) industry. The new release offers a viable alternative to OpenAI's ChatGPT, which may reassure an industry growing anxious that OpenAI and its principal investor, Microsoft, are becoming too monopolistic.

The alpha versions of the StableLM suite, featuring models with 3 billion and 7 billion parameters, are now accessible to the public. Models with 15 billion, 30 billion, and 65 billion parameters are currently being developed, while a 175 billion-parameter model is planned for the future.

By comparison, OpenAI's GPT-4 is estimated to have around 1 trillion parameters, roughly six times as many as GPT-3's 175 billion. Even so, Stability AI emphasized that parameter count alone is not an accurate measure of an LLM's effectiveness.

“StableLM is trained on a novel experimental dataset based on The Pile, but three times larger, containing 1.5 trillion tokens of content. The richness of this dataset allows StableLM to exhibit surprisingly high performance in conversational and coding tasks, even with its smaller 3 to 7 billion parameters.”

The robustness of the StableLM models remains to be seen. The Stability AI team has pledged to disclose more information about the LLMs’ capabilities on their GitHub page, including model definitions and training parameters. The emergence of a powerful, open-source alternative to OpenAI’s ChatGPT is welcomed by most industry insiders.

Sophisticated third-party tools such as BabyAGI and AutoGPT, as recently reported, are integrating recursion into AI applications: they can create and modify their own prompts for subsequent instances based on newly acquired information, as sketched below.
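
To make that recursion concrete, here is a minimal Python sketch of such a self-prompting loop. The llm() function is a purely hypothetical stand-in for any chat-model API call; the actual BabyAGI and AutoGPT codebases are considerably more elaborate.

```python
# Illustrative sketch of a self-prompting (recursive) agent loop. The llm()
# stub is hypothetical and stands in for any chat-model call; this is not
# BabyAGI's or AutoGPT's actual code.
def llm(prompt: str) -> str:
    # Placeholder: in practice this would call a hosted or local model.
    return f"(model response to: {prompt[:60]}...)"

def recursive_agent(objective: str, max_steps: int = 3) -> list[str]:
    notes: list[str] = []  # results accumulated from earlier steps
    prompt = f"Objective: {objective}\nPropose the first step."
    for _ in range(max_steps):
        result = llm(prompt)   # the model works on the current prompt
        notes.append(result)
        # The model then writes its *own* next prompt, folding in what it
        # has just produced -- this is the recursive element.
        prompt = llm(
            f"Objective: {objective}\n"
            f"Findings so far: {notes}\n"
            "Write the next prompt you should be given to make progress."
        )
    return notes

if __name__ == "__main__":
    for step in recursive_agent("Summarize recent open-source LLM releases"):
        print(step)
```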

Incorporating open-source models into the mix could benefit users who would rather not, or cannot, pay OpenAI's access fees. Interested readers can test a live interface for the Hugging Face-hosted 7 billion-parameter StableLM model.
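
For those who would rather experiment locally than through the hosted demo, the snippet below is a minimal sketch of loading the 7 billion-parameter checkpoint with the Hugging Face transformers library. The model ID stabilityai/stablelm-tuned-alpha-7b and the generation settings are assumptions; adjust them to match the checkpoint actually published on the StableLM pages.

```python
# Minimal sketch: loading and sampling from a 7B StableLM checkpoint.
# Assumes the model ID "stabilityai/stablelm-tuned-alpha-7b" and enough
# GPU memory for a half-precision 7B model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-tuned-alpha-7b"  # assumed Hugging Face model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit consumer GPUs
    device_map="auto",          # spread layers across available devices
)

prompt = "Write a short poem about open-source language models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    temperature=0.7,
    do_sample=True,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```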

It remains to be seen which company steps up to the plate next to offer similar LLM models.
