A highly performant 32B multilingual model designed to rival monolingual performance through innovations in instruction tuning with data arbitrage, preference training, and model merging. It serves 23 languages, including Arabic, Chinese, Japanese, Korean, and the major European languages. With a 128K context window, it handles substantial multilingual workloads effectively.
Context window: 128,000 tokens
Maximum output: 4,000 tokens
Input price: $0.50
Output price: $1.50
Multilingual research model; superseded by Command A