Mixtral-8x22B-Instruct-v0.1

Mixtral 8x22B is currently the most performant open model: a sparse Mixture-of-Experts (SMoE) model that uses only 39B active parameters out of 141B total.
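The "39B active out of 141B" figure comes from sparse routing: for each token, a small router picks a few experts and only those run, so most parameters sit idle on any given forward pass. Below is a minimal sketch of top-2 expert routing to illustrate the idea; the `SparseMoELayer` class, layer sizes, and expert count are illustrative assumptions, not Mixtral's actual architecture or configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Illustrative sparse MoE layer: each token is routed to its top-k
    experts, so only a fraction of total parameters are active per token.
    (Hypothetical sizes; not Mixtral's real implementation.)"""

    def __init__(self, dim: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, num_experts, bias=False)  # router
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.SiLU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Router scores every expert for every token.
        scores = self.gate(x)
        weights, indices = torch.topk(scores, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)  # normalize over chosen experts
        out = torch.zeros_like(x)
        # Only the selected experts run for each token; the rest stay idle.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out

layer = SparseMoELayer(dim=64)
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```

With 8 experts and top-2 routing, roughly a quarter of the expert parameters are used per token, which is the same mechanism that lets Mixtral activate 39B of its 141B parameters.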
