Altman’s Compute Conundrum
Sam Altman is too busy beefing with Elon Musk to notice that OpenAI’s infrastructure is buckling under GPT-5’s weight. The CEO’s latest X post reads like a hostage negotiation: “We’ll give you more compute… just not too much.” Paying users get bumped to 3,000 messages a week, unless they’re on the Team plan, where they’re rationed to 200 “Thinking” messages. Because nothing says teamwork like artificial scarcity.
API Access: Pay to Play
Existing API customers get first dibs on GPT-5. Newcomers? “We can support ~30% more… maybe.” Altman’s vagueness rivals a politician’s promise: “We’re doubling compute!” Cool. Doubling from what baseline? Crickets.
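If you’re one of those newcomers, the practical upshot of constrained capacity is the usual one: expect 429s and back off. Here’s a minimal sketch, assuming the standard Chat Completions endpoint and a hypothetical “gpt-5” model name (swap in whatever your account can actually see), of client-side retries with exponential backoff:

```python
import os
import time
import requests

API_URL = "https://api.openai.com/v1/chat/completions"
API_KEY = os.environ["OPENAI_API_KEY"]


def ask_gpt5(prompt: str, max_retries: int = 5) -> str:
    """Call the Chat Completions endpoint, backing off when rate-limited."""
    payload = {
        "model": "gpt-5",  # assumed model name for illustration only
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {"Authorization": f"Bearer {API_KEY}"}

    delay = 1.0
    for _ in range(max_retries):
        resp = requests.post(API_URL, json=payload, headers=headers, timeout=60)
        if resp.status_code == 429:
            # Honor Retry-After if the server sends it; otherwise back off exponentially.
            wait = float(resp.headers.get("Retry-After", delay))
            time.sleep(wait)
            delay *= 2
            continue
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]
    raise RuntimeError("Still rate-limited after retries; capacity really is that tight.")


if __name__ == "__main__":
    print(ask_gpt5("Summarize this week's rate-limit changes in one sentence."))
```

Nothing exotic here, just the pattern you’ll be leaning on until that promised compute actually shows up.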
Connector Circus 🤹
Need ChatGPT to rummage through your Dropbox, Teams, or Notion? Good news—if you’re not in Europe. Pro users ($200/month) unlock GitHub searches, because nothing screams premium like basic integrations locked behind a paywall.
The Real Story
OpenAI’s scrambling to scale while pretending it’s all part of the plan. GPT-5’s “bumpy” launch? More like a pothole-ridden highway. But hey, at least you can now ask ChatGPT to find that lost Teams message while it burns through your token allowance. Priorities, people.