OpenAI's o3 API is now much cheaper for developers, with no measurable impact on performance.
On Wednesday, OpenAI announced it's cutting the price of its best reasoning model, o3, by 80%.

This means o3’s input price is now just $2 per million tokens, while the output price has dropped to $8 per million tokens.
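The new rates are simple to reason about. Here is a minimal sketch (helper name and structure are my own, not an OpenAI API) that estimates the cost of a single o3 call under the prices quoted above:

```python
# Hypothetical cost estimator based on the new o3 pricing reported above:
# $2 per million input tokens, $8 per million output tokens.
# Actual billing is done by OpenAI; this only illustrates the arithmetic.

O3_INPUT_PER_M = 2.00   # USD per 1M input tokens
O3_OUTPUT_PER_M = 8.00  # USD per 1M output tokens

def o3_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one o3 API call at the new rates."""
    return (input_tokens / 1_000_000) * O3_INPUT_PER_M + \
           (output_tokens / 1_000_000) * O3_OUTPUT_PER_M

# Example: 10,000 input tokens and 2,000 output tokens
print(round(o3_cost(10_000, 2_000), 4))  # → 0.036
```

At these rates, even a fairly large request with a million tokens in each direction costs $10, which is the 80% cut in concrete terms.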
"We optimized our inference stack that serves o3. Same exact model—just cheaper," OpenAI noted in a post on X.
While most ChatGPT users never touch the API directly, the price drop makes API-based tools such as Cursor and Windsurf much cheaper to run.
In a post on X, the independent benchmark community ARC Prize confirmed that the o3-2025-04-16 model’s performance didn’t change after the price reduction.
"We compared the retest results with the original results and observed no difference in performance," the company said.
This confirms that OpenAI did not quietly swap in a different model to cut the price; the company genuinely optimized the inference stack that powers o3.
In addition, OpenAI rolled out the o3-pro model in the API, which uses more compute to deliver better results.
Original Article Published at Bleeping Computer