Hunyuan-A13B is a Mixture-of-Experts (MoE) language model developed by Tencent, with 80B total parameters, 13B active per token, and support for reasoning via Chain-of-Thought. It offers competitive benchmark performance across mathematics, science, coding, and multi-turn reasoning tasks, while maintaining high inference efficiency through Grouped Query Attention (GQA) and quantization support (FP8, GPTQ, etc.).
Suggested use cases:
- Efficient MoE reasoning tasks
- Math or science with thinking
- Budget multi-turn reasoning
Context length: 131,072 tokens
Max output: 131,072 tokens
Pricing:
- Input: $0.14 per 1M tokens
- Output: $0.57 per 1M tokens
- $15 per 1K calls
- $0.19 per 1K calls
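As a quick sanity check on the per-token rates, here is a minimal sketch of estimating the cost of a single request, assuming the two per-1M-token rates listed are input and output pricing respectively (the function name and defaults are illustrative, not part of any official SDK):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_rate: float = 0.14, output_rate: float = 0.57) -> float:
    """Estimate USD cost of one request, given rates in dollars per 1M tokens.

    Assumed rates: $0.14/1M input tokens, $0.57/1M output tokens.
    """
    return input_tokens / 1e6 * input_rate + output_tokens / 1e6 * output_rate

# Example: a 10k-token prompt with a 2k-token completion
print(f"${estimate_cost(10_000, 2_000):.5f}")  # → $0.00254
```

Per-call charges, if applicable, would be added on top of the token-based total.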