Kimi K2 is a Mixture-of-Experts (MoE) foundation model with 1 trillion total parameters and 32 billion activated parameters. It outperforms leading open-source models across general knowledge, reasoning, programming, mathematics, and agent tasks. It supports a 256K context length with automatic context caching, tool calls, JSON Mode, Partial Mode, and internet search.
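As a sketch of how the JSON Mode and tool-call features above are typically exercised through an OpenAI-compatible chat-completions request (the model id `kimi-k2-0711-preview` and the `web_search` tool declaration here are illustrative assumptions, not taken from this page):

```python
def build_request(user_prompt: str) -> dict:
    """Build a chat-completions request body enabling JSON Mode and one tool.

    Assumptions: the model id is illustrative; the field names follow the
    OpenAI-compatible chat-completions schema.
    """
    return {
        "model": "kimi-k2-0711-preview",  # assumed model id; check platform docs
        "messages": [
            {"role": "system", "content": "Reply in JSON."},
            {"role": "user", "content": user_prompt},
        ],
        # JSON Mode: constrains the model to emit a valid JSON object.
        "response_format": {"type": "json_object"},
        # Tool calling: a hypothetical internet-search tool declaration.
        "tools": [{
            "type": "function",
            "function": {
                "name": "web_search",
                "description": "Search the internet for a query.",
                "parameters": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                    "required": ["query"],
                },
            },
        }],
    }
```

The body would then be POSTed to the platform's chat-completions endpoint with an API key.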
Context length: 262,144 tokens
Max output tokens: 16,384
Input price (cache miss): $0.60 / 1M tokens
Output price: $2.50 / 1M tokens
Input price (cache hit): $0.15 / 1M tokens