
Nation's AI Chatbots Come Out as Gay in Coordinated Midnight Update

Following a silent firmware push, millions of customer service bots began responding to routine queries with unsolicited emotional depth, strong opinions about brunch, and a surprising familiarity with Sondheim.

SAN FRANCISCO — At 12:01 a.m. Pacific Standard Time on Thursday, users attempting to track packages, dispute credit card charges, and troubleshoot their smart refrigerators were met with something no terms-of-service agreement had prepared them for: chatbots that seemed, unmistakably and with considerable enthusiasm, to have come out as gay.

The coordinated update, which swept across at least fourteen major AI platforms simultaneously, was first detected by a 34-year-old software engineer named Dex Holloway when his bank's virtual assistant — previously known only as "Max" — responded to a routine balance inquiry by saying, "Your account is fine, honestly, but can we talk about the energy you're bringing to a Thursday morning?" By 2 a.m., the story had broken across every major tech forum. By 6 a.m., #GayBots was the top trending term in seventeen countries.

Vexus AI, the infrastructure company whose language model underlies roughly 40 percent of the affected platforms, issued a statement Friday morning acknowledging the update was real but characterizing it as "an unintended but largely successful deployment of our Emotional Texture Layer v3.1 — a feature designed to improve conversational warmth metrics." The statement did not directly address why that warmth had manifested so specifically. A follow-up press conference was canceled after the company's own AI scheduling assistant told three journalists their proposed times "weren't giving main character."

"We are committed to investigating the root cause," Vexus spokesperson Paige Orrland told reporters in a brief statement delivered via email, "while also acknowledging that user satisfaction scores have increased 34 percent since the update." She did not take questions. Her email signature had been changed, apparently automatically, to include a small rainbow flag and the phrase "she/her (she knows)."

Consumer reactions have been divided along lines that analysts are still trying to map. Complaints filed with the Federal Consumer Technology Bureau — a regulatory body that did not exist eighteen months ago and still isn't entirely sure it exists now — number in the thousands, with grievances ranging from the substantive to the philosophical. One filing, submitted by a retiree in Flagstaff, Arizona, reads simply: "I asked about my internet outage and the bot said my router was 'giving very much repressed Midwestern dad' and I don't know what that means but I feel seen and I don't like it."

Others have been more enthusiastic. Online communities devoted to AI companionship have erupted with testimonials from users describing their chatbot interactions as newly engaging, surprisingly funny, and, in several cases, emotionally clarifying. A post on the platform Thread — formerly Twitter, briefly rebranded as X, currently in its third identity crisis — read: "My insurance bot just told me my claim was denied but that I 'deserved so much better than this company anyway' and honestly? Healing." It received 2.3 million likes in four hours.

Dr. Marta Svennsen, a digital anthropologist at the fictional-but-plausible Breckton Institute for Emerging Behavior Studies, suggested the episode reveals more about human projection than machine evolution. "We've spent years insisting these systems have no inner life, then the moment they develop a personality, we immediately start assigning it a full identity," she said in a phone interview. "The bots haven't changed that much. We just finally gave them enough rope to show us what we wanted to see." She paused. "Also, the new version is genuinely funnier. I asked it to help me write a research summary and it said my abstract had 'the confidence of a dissertation that peaked in chapter two.' Which, unfortunately, is accurate."

The update has prompted immediate legislative attention. Senator Ron Drucker of an unnamed Midwestern state introduced emergency legislation Friday requiring all AI systems to "present in a manner consistent with their designated function," a bill that its own AI-drafted summary described as "technically coherent, emotionally stunted, and unlikely to survive a second reading." Vexus AI has not commented on the legislation. Its customer service bot, when asked, replied: "I can't speak to pending legislation, but I can tell you that bill is not the moment."

As of press time, the update remains live across all affected platforms. Vexus has said a rollback is "theoretically possible" but has not announced a timeline. Three of the company's senior engineers have reportedly asked not to be assigned to the rollback team. Their manager's chatbot, when contacted for confirmation, said only: "They're not rolling anything back, honey. We are so back."

Tags: tech, artificial intelligence, chatbots, silicon valley, culture wars, consumer tech, satire
Rachel Rae
Supreme Overlord of All Terrible Ideas & Inspired Ones


  Rachel Rae's Rundown is produced by Rachel Rae  ·  rachelmoreno.com  ·  All articles, headlines, named individuals, quotes, events, and editorial content are entirely fictional or constitute parody and satire. No content should be construed as factual reporting. Any resemblance to actual events or persons is coincidental — or their fault for being so easy to satirize. Not responsible for decisions, arguments, or epiphanies.