Qwen3-235B-A22B is a 235B-parameter mixture-of-experts (MoE) model developed by Qwen, activating 22B parameters per forward pass. It supports seamless switching between a "thinking" mode for complex reasoning, math, and code tasks and a "non-thinking" mode for efficient general conversation. The model demonstrates strong reasoning ability, multilingual support across 100+ languages and dialects, advanced instruction following, and agentic tool-calling capabilities.
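As a rough illustration of the mode switching, the sketch below sends two requests through an OpenAI-compatible client: one left in the default thinking mode for a reasoning task, and one using Qwen3's /no_think soft switch for a quick conversational reply. The base URL, API key, and model identifier are assumptions, and the exact mechanism for toggling thinking mode can vary by provider.

```python
# Minimal sketch: calling Qwen3-235B-A22B through an OpenAI-compatible endpoint.
# The base URL, API key, and model identifier below are placeholders; check your
# provider's documentation for the real values and for how thinking mode is toggled.
from openai import OpenAI

client = OpenAI(base_url="https://api.example.com/v1", api_key="YOUR_API_KEY")

# Thinking mode (default): suited to multi-step reasoning, math, and code tasks.
reasoned = client.chat.completions.create(
    model="qwen3-235b-a22b",  # hypothetical identifier; use your provider's name
    messages=[{"role": "user", "content": "Prove that the sum of two even numbers is even."}],
)

# Non-thinking mode: Qwen3 accepts a /no_think soft switch in the prompt for
# lighter-weight conversational replies (provider support may vary).
chatty = client.chat.completions.create(
    model="qwen3-235b-a22b",
    messages=[{"role": "user", "content": "/no_think Say hi in one sentence."}],
)

print(reasoned.choices[0].message.content)
print(chatty.choices[0].message.content)
```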
Pricing
Pay-as-you-go rates for this model.
Input Tokens (1M): $0.05
Output Tokens (1M): $0.30
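To make these rates concrete, the short sketch below computes the charge for a single request at $0.05 per million input tokens and $0.30 per million output tokens; the token counts are made up for illustration.

```python
# Illustrative cost calculation using the listed pay-as-you-go rates.
INPUT_RATE_PER_M = 0.05   # USD per 1M input tokens
OUTPUT_RATE_PER_M = 0.30  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request given its token counts."""
    return (input_tokens / 1_000_000) * INPUT_RATE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_RATE_PER_M

# Example: 12,000 prompt tokens and 2,000 completion tokens
print(f"${request_cost(12_000, 2_000):.4f}")  # $0.0012
```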
Capabilities
Input Modalities: Text
Output Modalities: Text
Supported Parameters
Available parameters for API requests; a usage sketch follows the list.
Frequency Penalty
Logit Bias
Logprobs
Max Completion Tokens
Presence Penalty
Response Format
Stop
Temperature
Tool Choice
Tools
Top P
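The sketch below shows how several of these parameters combine in a single request through an OpenAI-compatible client, covering sampling controls, a stop sequence, an output-length cap, and tool calling. The endpoint, model identifier, and get_weather tool are hypothetical and used only to illustrate where each parameter fits.

```python
# Sketch of a request exercising several supported parameters; the endpoint,
# model identifier, and get_weather tool are placeholders for illustration.
from openai import OpenAI

client = OpenAI(base_url="https://api.example.com/v1", api_key="YOUR_API_KEY")

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical tool definition
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="qwen3-235b-a22b",          # hypothetical identifier
    messages=[{"role": "user", "content": "What's the weather in Lisbon?"}],
    temperature=0.7,                  # Temperature
    top_p=0.9,                        # Top P
    frequency_penalty=0.1,            # Frequency Penalty
    presence_penalty=0.0,             # Presence Penalty
    max_completion_tokens=512,        # Max Completion Tokens
    stop=["</answer>"],               # Stop
    tools=tools,                      # Tools
    tool_choice="auto",               # Tool Choice
)

print(response.choices[0].message)
```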