Pricing, Performance & Features Comparison
Kimi-latest-128k refers to the Kimi K2 model, a state-of-the-art Mixture-of-Experts (MoE) language model with 32 billion activated parameters and 1 trillion total parameters. It offers a 128K-token context length and is optimized for agentic capabilities: tool use, reasoning, and autonomous problem-solving.
Kimi-k2-0711-preview is a version of the Kimi K2 language model developed by Moonshot AI. It is a mixture-of-experts model with 32 billion activated parameters and 1 trillion total parameters, optimized for agentic tasks: acting, executing, and reasoning through complex, tool-driven workflows. The model targets general-purpose chat and autonomous task execution, with enhanced coding capabilities.
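Since both models are selected by name and share a 128K context window, a minimal sketch of how a request might be assembled (assuming an OpenAI-compatible chat completions payload, which Moonshot AI provides; the model names are taken from this comparison, while the 4-characters-per-token estimate and the helper itself are illustrative assumptions):

```python
MAX_CONTEXT_TOKENS = 128_000  # 128K context window shared by both models


def build_request(model: str, prompt: str, max_output_tokens: int = 1024) -> dict:
    """Build a chat completions payload, leaving room in the 128K
    context window for the requested output tokens."""
    # Rough token estimate: ~4 characters per token (a common heuristic,
    # not an exact tokenizer count).
    est_input_tokens = len(prompt) // 4
    if est_input_tokens + max_output_tokens > MAX_CONTEXT_TOKENS:
        raise ValueError("prompt likely exceeds the 128K context budget")
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_output_tokens,
    }


req = build_request("kimi-latest-128k", "Summarize the MoE architecture.")
print(req["model"])  # -> kimi-latest-128k
```

Swapping in `"kimi-k2-0711-preview"` as the `model` value is the only change needed to target the preview snapshot instead.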