Glama

jamba-1-5-mini vs phi-3.5-mini-128k-instruct

Pricing, Performance & Features Comparison

Author: ai21
Context Length: 256K
Reasoning: -
Providers: 1
Released: Aug 2024
Knowledge Cutoff: Mar 2024
License: -

ai21/jamba-1-5-mini is a general-purpose instruction-following text generator that can handle tasks like question answering, summarization, and sentiment analysis in multiple languages. It supports a large 256K token context window and can be used for both research and commercial applications. The model leverages an advanced SSM-Transformer architecture and can produce grounded, structured outputs such as JSON.

Input Price: $0.2
Output Price: $0.4
Latency (p50): -
Output Limit: 4K
Function Calling: Yes
JSON Mode: Yes
Input Modality: Text
Output Modality: Text
Author: microsoft
Context Length: 128K
Reasoning: -
Providers: 1
Released: Aug 2024
Knowledge Cutoff: Oct 2023
License: MIT License

Phi-3.5-mini-128k-instruct is a compact, advanced language model that can handle up to 128K tokens of context, enabling tasks such as long-document summarization, multi-turn conversation, and complex reasoning. It is optimized for logic, code, and math, and provides robust multilingual capabilities. With 3.8 billion parameters, it uses a dense decoder-only transformer architecture that balances performance and efficiency.

Input Price: $0.1
Output Price: $0.1
Latency (p50): -
Output Limit: 128K
Function Calling: -
JSON Mode: -
Input Modality: Text
Output Modality: Text
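To make the price gap concrete, here is a minimal cost-comparison sketch using the listed input/output prices. The price unit is not shown on this page, so the code assumes the common per-1M-tokens convention; the workload sizes (100K input tokens, 2K output tokens) are illustrative, not from the source.

```python
# Rough cost comparison for the two models' listed prices.
# Assumption: prices are per 1M tokens (the page does not state the unit);
# the example workload sizes are hypothetical.

PRICES = {
    "ai21/jamba-1-5-mini": {"input": 0.2, "output": 0.4},
    "microsoft/phi-3.5-mini-128k-instruct": {"input": 0.1, "output": 0.1},
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one request, assuming per-1M-token pricing."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# Example workload: 100K input tokens, 2K output tokens.
for model in PRICES:
    print(f"{model}: ${request_cost(model, 100_000, 2_000):.4f}")
```

Under these assumptions, the same request costs roughly twice as much on jamba-1-5-mini as on phi-3.5-mini-128k-instruct, since phi's input and output rates are both half or less of jamba's.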