Settings

Mixtral-8x7B-Instruct-v0.1

An 8x7B sparse Mixture-of-Experts (SMoE) model: only 12.9B of its 46.7B total parameters are active per token.
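As background on why the active count is smaller than the total: Mixtral routes each token to 2 of 8 experts per layer, so only the selected experts' weights are used for that token. The sketch below illustrates top-2 routing in a generic PyTorch MoE layer; the layer sizes and class names are illustrative assumptions, not Mixtral's actual dimensions or implementation.

```python
# Minimal sketch of top-2 sparse MoE routing (8 experts, 2 active per token).
# Dimensions are illustrative, not the real Mixtral sizes.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    def __init__(self, d_model=64, d_ff=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        logits = self.router(x)                           # (tokens, n_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # renormalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, k] == e                 # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k, None] * expert(x[mask])
        return out

# Only top_k of n_experts run per token, which is why the per-token
# "active" parameter count is much smaller than the total.
x = torch.randn(4, 64)
print(SparseMoELayer()(x).shape)  # torch.Size([4, 64])
```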

Model website

System Prompt