πŸ¦™ Mistral 7B & Ollama: LLMs πŸ’ Apache 2.0 Open Source on small hardware

adriens - Oct 19 '23 - Dev Community

πŸ’­ About

I recently stumbled upon this news:

Mistral AI makes its first large language model free for everyone | TechCrunch

Mistral, a French AI startup that raised a huge seed round in June, has debuted its first model, totally free to download and use.


The two key points that caught my attention were:

the model was released under the Apache 2.0 license,...

and

Mistral 7B is a further refinement of other β€œsmall” large language models like Llama 2, offering similar capabilities (according to some standard benchmarks) at a considerably smaller compute cost

Then, a few weeks later, came the release of the "GenAI Stack":

πŸ‘‰ So I started to look around... and see what benefits I could get out of these tools.

πŸ’­ Then & now

Until then, my only two options were:

  • Use my OpenAI account and use gpt-3.5-turbo flavours or gpt-4 to achieve custom projects (and share my data with OpenAI)
  • Use Kaggle GPUs resources & play with resources for free... still with the need to share (at least temporarly my resources) with Kaggle

🎯 What we'll achieve

With these two announcements, I started wondering what I could achieve locally, on my own hardware.


πŸ‘‰ This blog post is dedicated to unboxing this stack, the exciting possibilities it opens up... and why Open Source matters.
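
To give an idea of what "locally on my own hardware" looks like in practice, here is a minimal sketch of getting Mistral 7B answering prompts through Ollama. The install one-liner and the `mistral` model tag reflect Ollama's documented Linux setup at the time; adjust them to your platform:

```sh
# Install Ollama on Linux (see https://ollama.ai for other platforms)
curl -fsSL https://ollama.ai/install.sh | sh

# Pull the Apache 2.0 licensed Mistral 7B weights (a few GB download)
ollama pull mistral

# Chat with the model entirely on local hardware: no API key, no data leaving the machine
ollama run mistral "Explain in one sentence why the Apache 2.0 license matters."
```

Ollama also exposes a small REST API on localhost:11434, so the same local model can be called from scripts or from the GenAI Stack containers.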

🍿 Demo

πŸ’‘ Load any custom model into ollama

Below is a short and easy trick to install a fine-tuned custom model into Ollama: jackalope-7b (a fine-tuned version of Mistral-7B).
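
Here is a minimal sketch of that trick, assuming a GGUF build of jackalope-7b has already been downloaded (the file name and temperature value below are examples, adapt them to the quantization you pick):

```sh
# 1. Point a Modelfile at the local GGUF weights (example file name)
cat > Modelfile <<'EOF'
FROM ./jackalope-7b.Q4_K_M.gguf
# Optional generation defaults
PARAMETER temperature 0.7
EOF

# 2. Register the model locally under a name of your choice
ollama create jackalope-7b -f Modelfile

# 3. Use it exactly like the built-in models
ollama run jackalope-7b "Who are you?"
```

The same recipe works for any fine-tune distributed as a GGUF file, which is a big part of what makes the Apache 2.0 + Ollama combination so flexible.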

πŸ”– Bookmarks

More about the GenAI Stack
