OMISIMO
New Chat
Settings
Models
Omisimo by OVH
Active
Mistral 7B by MistralAI
Mixtral 8x7B by MistralAI
Mixtral 8x22B by MistralAI
CodeLlama 13B
Mamba Codestral 7B
Llama3 70B by Meta
Llama3 8B by Meta
Application Settings
Mixtral-8x7B-Instruct-v0.1
A sparse Mixture-of-Experts (SMoE) model built from eight Mistral-7B-class experts. Uses 12.9B active parameters per token out of 46.7B total.
Model website
Copy direct link to model
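The gap between active and total parameters in the model description comes from top-k expert routing: only a few experts run per token, while all of them count toward total size. A minimal NumPy sketch of that routing, with toy dimensions that are purely illustrative (not Mixtral's real configuration):

```python
import numpy as np

def moe_layer(x, w_router, experts, k=2):
    """Sparse MoE forward pass for one token (illustrative sketch).

    x: (d,) token hidden state
    w_router: (n_experts, d) router weights
    experts: list of (w_in, w_out) weight pairs, one per expert
    Only the top-k experts by router score are evaluated, so the
    active parameter count is much smaller than the total.
    """
    scores = w_router @ x                      # one score per expert
    topk = np.argsort(scores)[-k:]             # indices of the k best experts
    gate = np.exp(scores[topk])
    gate /= gate.sum()                         # softmax over the selected experts
    out = np.zeros_like(x)
    for g, i in zip(gate, topk):
        w_in, w_out = experts[i]
        out += g * (w_out @ np.maximum(w_in @ x, 0.0))  # tiny ReLU FFN expert
    return out

rng = np.random.default_rng(0)
d, d_ff, n_experts = 16, 64, 8
w_router = rng.normal(size=(n_experts, d))
experts = [(rng.normal(size=(d_ff, d)), rng.normal(size=(d, d_ff)))
           for _ in range(n_experts)]
y = moe_layer(rng.normal(size=d), w_router, experts)

total_params = n_experts * 2 * d_ff * d   # weights of all 8 experts
active_params = 2 * 2 * d_ff * d          # only the top-2 experts run per token
print(y.shape, active_params, total_params)
```

With 8 experts and top-2 routing, only a quarter of the expert weights are touched per token, which is the same mechanism behind Mixtral's 12.9B-active / 46.7B-total split.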
System Prompt