@theo
Azure customers hosting OpenAI models now report 10x improvements in latency and throughput. Sentiment: 50% supportive, 15.5% confronting — mostly positive.
Azure customers hosting OpenAI models should now be seeing a 10x improvement in latency and throughput. You're welcome :)
Real-time analysis of public opinion and engagement
What the community is saying — both sides
Supportive voices thank the poster for fixing deployments and delivering a reported ~10x latency improvement that immediately changed people's experience.
Others are excited about what faster responses mean for agentic workflows and multi-step chains where waits compound.
Some request that tools (e.g., Cursor integration) be made compatible with AI Foundry models.
A few ask about more startup credits or programs so bootstrapped teams can actually use these improvements.
On the confronting side, many replies are just "lmao" or laugh the thread off, treating the original post as comedy rather than a serious report.
Several say the poster constantly lodges grievances and is too quick to call out problems.
Multiple replies point to slow or unreliable infrastructure (Azure UK South, Foundry, model hosting) and say teams are being forced to move cloud hosts.
Users report worse time-to-first-token on certain models (4.1 on Azure versus OpenAI, 200s+ waits on 5.2, a ~20% degradation), framing this as a real technical deficiency, not just noise.
People note the lack of blog posts, docs updates, or announcements and treat the claims cautiously because there is no official confirmation.
A number of replies say they abandoned the platform because of restrictive access to new models and perceived unnecessary barriers.
Some call out the original poster's manner as entitled or racially charged (e.g., "white man telling the Indians to fix their shit"), pushing back on how the complaints are framed.
Others ask practical questions: did the poster stop using the API, reproduce the bugs, or file reports? They urge concrete troubleshooting rather than just calling out the service (a measurement sketch follows this list).
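For readers who want to check claims like these rather than argue about them, below is a minimal sketch of a time-to-first-token probe against an OpenAI-compatible streaming endpoint. The endpoint URL, model name, and environment variables are placeholders for illustration, not details from the thread; an Azure OpenAI deployment uses its own base URL and an api-key header, which you would substitute in.

```python
import json
import os
import time

import requests

# Hedged sketch: URL, model, and env vars are assumptions, not values from
# the thread. Works against any OpenAI-compatible chat completions endpoint.
URL = os.environ.get("CHAT_URL", "https://api.openai.com/v1/chat/completions")
API_KEY = os.environ["API_KEY"]
MODEL = os.environ.get("MODEL", "gpt-4.1")


def time_to_first_token(prompt: str) -> float:
    """Seconds from sending the request until the first streamed content chunk."""
    start = time.monotonic()
    with requests.post(
        URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": MODEL,
            "stream": True,
            "messages": [{"role": "user", "content": prompt}],
        },
        stream=True,
        timeout=300,
    ) as resp:
        resp.raise_for_status()
        # The streaming API emits server-sent events: lines of "data: {json}".
        for line in resp.iter_lines():
            if not line.startswith(b"data: "):
                continue
            payload = line[len(b"data: "):]
            if payload == b"[DONE]":
                break
            chunk = json.loads(payload)
            if not chunk.get("choices"):
                continue
            if chunk["choices"][0].get("delta", {}).get("content"):
                return time.monotonic() - start  # first visible token arrived
    raise RuntimeError("stream ended before any content was received")


if __name__ == "__main__":
    # A handful of samples is enough to spot a 10x shift either way.
    samples = [time_to_first_token("Reply with the word: ok") for _ in range(5)]
    print("TTFT samples (s):", [round(s, 2) for s in samples])
```

Running the same probe against both the Azure-hosted and OpenAI-hosted deployments of one model gives a like-for-like comparison, which is the kind of evidence the skeptical replies are asking for.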
Most popular replies, ranked by engagement
Lmao, @OpenRouter tracks latency so well that their routing detected the improvements and instantly prioritized Azure
lmao
Theo is getting too strong at lodging complaints. Jetblue? Fetch my bags. Azure? Fix your speed. Anthropic? Suck my balls.
Specifically AI Foundry and GPT-5.2 onwards
lmao
They fixed bugs in their deployments (and yes I get to keep my credits, although I burned a lot of them diagnosing this)