Microsoft

Browse models from Microsoft

2 models

WizardLM 2 8x22B

69K Tokens

WizardLM-2 8x22B is Microsoft AI's most advanced Wizard model, demonstrating highly competitive performance compared to leading proprietary models and consistently outperforming state-of-the-art open-source models. It is an instruct finetune of Mixtral 8x22B and is optimized for complex reasoning and instruction-following tasks. For more information, see the [official release](https://wizardlm.github.io/WizardLM2/).

by Microsoft
$0.25/1M input tokens · $0.25/1M output tokens

Phi-4

10K Tokens

Phi-4 is a 14B-parameter model from Microsoft Research, designed for complex reasoning tasks and efficient operation in low-memory or rapid-response scenarios. Trained on a mix of high-quality synthetic and curated data, it is optimized for English language inputs and demonstrates strong instruction following and safety standards. For more details, see the [Phi-4 Technical Report](https://arxiv.org/pdf/2412.08905).

by Microsoft
$0.04/1M input tokens · $0.07/1M output tokens
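The per-million-token prices listed above make it easy to estimate what a request will cost. A minimal sketch in Python, using the prices from this page (the `estimate_cost` helper and the example token counts are illustrative, not part of any official API):

```python
# Estimate request cost in USD from the per-token pricing listed above.
# Prices are USD per 1M tokens, as shown on this page.
PRICING = {
    "WizardLM-2 8x22B": {"input": 0.25, "output": 0.25},
    "Phi-4": {"input": 0.04, "output": 0.07},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for a single request."""
    p = PRICING[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# Example: a request with 2,000 prompt tokens and 500 completion tokens on Phi-4
cost = estimate_cost("Phi-4", 2_000, 500)  # (2000*0.04 + 500*0.07) / 1e6 = $0.000115
```

Note that Phi-4's asymmetric pricing (output tokens cost more than input tokens) means long completions dominate cost, while WizardLM-2 8x22B charges input and output at the same rate.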