Kimi K2 is a Mixture-of-Experts (MoE) foundation model with 1 trillion total parameters and 32 billion activated parameters. It outperforms leading open-source models on general knowledge and reasoning, programming, mathematics, and agent tasks. The model supports a 256K-token context window with automatic context caching, tool calls (ToolCalls), JSON Mode, Partial Mode, and internet search.
Recommended scenarios:
- Coding and agent tasks
- Applications that need a strong foundation model
- General-purpose MoE generation
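To illustrate the API features listed above, here is a minimal sketch of calling Kimi K2 through an OpenAI-compatible chat-completions client. The base URL, the model identifier `kimi-k2-0711-preview`, and the `MOONSHOT_API_KEY` environment variable are assumptions for this example, and the `"partial": True` message field for Partial Mode follows Moonshot's documented convention rather than anything stated on this page.

```python
# A minimal sketch, assuming Moonshot's OpenAI-compatible endpoint.
# The model name "kimi-k2-0711-preview" and the MOONSHOT_API_KEY
# environment variable are illustrative assumptions.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["MOONSHOT_API_KEY"],
    base_url="https://api.moonshot.ai/v1",
)

# JSON Mode: constrain the reply to a single JSON object.
resp = client.chat.completions.create(
    model="kimi-k2-0711-preview",
    messages=[
        {"role": "system", "content": "Reply with a JSON object."},
        {"role": "user", "content": 'Give three primes as {"primes": [...]}.'},
    ],
    response_format={"type": "json_object"},
)
print(resp.choices[0].message.content)

# Partial Mode: prefill the start of the assistant's reply so the model
# continues from it; the "partial": True field is Moonshot's convention
# and is an assumption here.
resp = client.chat.completions.create(
    model="kimi-k2-0711-preview",
    messages=[
        {"role": "user", "content": "Write a haiku about autumn."},
        {"role": "assistant", "content": "Golden leaves drift down", "partial": True},
    ],
)
print(resp.choices[0].message.content)
```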
Context window: 262,144 tokens
Max output tokens: 16,384 tokens
Input price (cache miss): $0.60 per 1M tokens
Output price: $2.50 per 1M tokens
Input price (cache hit): $0.15 per 1M tokens
Additional per-call fees: $15 per 1K calls; $0.19 per 1K calls
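To make the per-token rates concrete, the sketch below computes the cost of a single request from the rates above. The cache-hit/cache-miss labeling follows the table reconstruction on this page, and the token counts are hypothetical.

```python
# Worked cost example using the per-1M-token rates above; the cache-hit /
# cache-miss labels follow the reconstructed table and the token counts
# are hypothetical.
INPUT_MISS_PER_M = 0.60  # $ per 1M uncached input tokens
INPUT_HIT_PER_M = 0.15   # $ per 1M cached input tokens
OUTPUT_PER_M = 2.50      # $ per 1M output tokens

def request_cost(cached_in: int, uncached_in: int, out: int) -> float:
    """Return the dollar cost of one request."""
    return (
        cached_in * INPUT_HIT_PER_M
        + uncached_in * INPUT_MISS_PER_M
        + out * OUTPUT_PER_M
    ) / 1_000_000

# e.g. a 200K-token prompt with 150K tokens served from cache and a
# 4K-token reply:
print(f"${request_cost(150_000, 50_000, 4_000):.4f}")  # -> $0.0625
```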