    Qwen: Qwen3 235B A22B Instruct 2507

    qwen/qwen3-235b-a22b-2507

    Created: Jul 21, 2025 · Context: 262,144 tokens
    Pricing: $0.071/M input tokens · $0.463/M output tokens

    Qwen3-235B-A22B-Instruct-2507 is a multilingual, instruction-tuned mixture-of-experts language model based on the Qwen3-235B architecture, with 22B active parameters per forward pass. It is optimized for general-purpose text generation, including instruction following, logical reasoning, math, code, and tool usage. The model supports a native 262K context length and does not implement "thinking mode" (<think> blocks).

    Compared to its base variant, this version delivers significant gains in knowledge coverage, long-context reasoning, coding benchmarks, and alignment on open-ended tasks. It is particularly strong in multilingual understanding, math reasoning (e.g., AIME, HMMT), and alignment evaluations such as Arena-Hard and WritingBench.
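
    A minimal sketch of calling this model through OpenRouter's OpenAI-compatible chat completions endpoint, using the model slug shown above. The `OPENROUTER_API_KEY` environment variable name and the example prompt are placeholders, not part of this page.

    ```python
    # Minimal sketch: query qwen/qwen3-235b-a22b-2507 via OpenRouter's
    # OpenAI-compatible API. OPENROUTER_API_KEY is a placeholder env var.
    import os
    from openai import OpenAI

    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",  # OpenRouter endpoint
        api_key=os.environ["OPENROUTER_API_KEY"],
    )

    response = client.chat.completions.create(
        model="qwen/qwen3-235b-a22b-2507",  # slug from this page
        messages=[
            {"role": "user", "content": "Summarize the key ideas behind mixture-of-experts models."},
        ],
        max_tokens=512,  # optional cap on completion length
    )

    print(response.choices[0].message.content)
    ```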

    Recent activity on Qwen3 235B A22B Instruct 2507

    Total usage per day on OpenRouter

    Prompt: 925M
    Completion: 79.9M

    Prompt tokens measure input size. Reasoning tokens show internal thinking before a response. Completion tokens reflect total output length.
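
    Continuing the sketch above, the `usage` object on a completed response reports the same prompt and completion token counts that the activity chart tracks; the cost arithmetic below simply applies the per-million rates listed on this page and is an illustration, not a billing guarantee.

    ```python
    # Token accounting from the response returned in the previous sketch.
    usage = response.usage
    print(f"prompt tokens:     {usage.prompt_tokens}")
    print(f"completion tokens: {usage.completion_tokens}")
    print(f"total tokens:      {usage.total_tokens}")

    # Rough cost estimate at the listed rates ($0.071/M input, $0.463/M output).
    cost = usage.prompt_tokens / 1e6 * 0.071 + usage.completion_tokens / 1e6 * 0.463
    print(f"approx. cost: ${cost:.6f}")
    ```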