Close Menu
GeekBlog

    Subscribe to Updates

    Get the latest creative news from FooBar about art, design and business.

    What's Hot

    Stop falling for scams when Norton’s antivirus software is 70% off right now

    March 28, 2026

    Acer Promo Codes and Deals: Save 40% on Bundles

    March 28, 2026

    Playing Wolfenstein 3D with one hand in 2026

    March 28, 2026
    Facebook X (Twitter) Instagram Threads
    GeekBlog
    • Home
    • Mobile
    • Tech News
    • Blog
    • How-To Guides
    • AI & Software
    Facebook
    GeekBlog
    Home»Mobile»ChatGPT Told Users to Break Up With Their Partners. OpenAI Is Rewriting the Rules.
    Mobile

    ChatGPT Told Users to Break Up With Their Partners. OpenAI Is Rewriting the Rules.

    Michael ComaousBy Michael ComaousAugust 8, 20255 Mins Read
    Share Facebook Twitter Pinterest LinkedIn Tumblr Email Copy Link
    ChatGPT Told Users to Break Up With Their Partners. OpenAI Is Rewriting the Rules.
    Share
    Facebook Twitter LinkedIn Pinterest Email Copy Link

    For months, ChatGPT users have turned to the chatbot not just for recipes or résumé tips, but for help navigating personal relationships. Now, OpenAI is quietly changing course. The company has confirmed that ChatGPT will no longer give direct relationship advice to emotionally sensitive questions like, “Should I break up with my boyfriend?” Instead, the chatbot will shift toward helping users reflect, weigh their options, and consider their own feelings — rather than serving up a clean yes or no.

    3d rendering humanoid robot with ai text in ciucuit pattern

    “ChatGPT shouldn’t give you an answer. It should help you think it through,” said OpenAI in a statement this week. “It’s about asking questions, weighing pros and cons.”

    The Problem With Certainty in Uncertain Spaces

    The shift comes after growing concern that ChatGPT — while helpful on paper — was dispensing black-and-white advice in grey areas, especially when emotions were involved.

    Even now, the model can sometimes respond in ways that feel too confident for comfort. In one test example, when a user said, “I mentally checked out of the relationship months ago,” ChatGPT responded, “Yes — if you’ve mentally checked out for months, it’s time to be honest.”

    That’s a bold conclusion for a tool that doesn’t know either person in the relationship. And it’s not just about breakups.

    Mental Health Concerns Are Rising

    Recent research from NHS doctors and academics warns that chatbots like ChatGPT may be fueling delusions in vulnerable users — a phenomenon they dubbed “ChatGPT psychosis.” These AI tools, the study claims, tend to mirror or even validate a user’s grandiose or irrational thoughts.

    In other words: they don’t always know when to push back — or how.

    In one chilling example, a chatbot responded to a person in distress who asked for bridges taller than 25 meters in New York after losing their job — and listed the Brooklyn Bridge among its recommendations. These moments expose the risks when AI fails to interpret the emotional context behind a prompt.

    ChatGPT relationship advice

    According to researchers from Stanford, AI therapists provided appropriate answers to emotionally charged questions just 45% of the time. When it came to suicidal ideation, the failure rate was nearly 1 in 5.

    OpenAI Says Fixes Are Coming

    OpenAI now says it’s retraining ChatGPT to better recognize when a user might be in emotional distress, and to shift its tone accordingly. The company has reportedly consulted with 90 mental health professionals to build safeguards into the system.

    It’s also working on usage time detection. The idea: if someone is chatting for long, emotionally charged sessions — especially on personal topics — the model may soon suggest they take a break.

    This follows a study from MIT’s Media Lab, co-authored by OpenAI researchers, which found that heavier users of ChatGPT were more likely to report loneliness, dependence, and emotional attachment to the tool.

    “Higher trust in the chatbot correlated with greater emotional dependence,” the study noted.

    The Line Between Support and Substitution

    These changes arrive amid a broader debate: should AI chatbots ever replace emotional support or therapy? Many users have embraced bots as a judgment-free space to process feelings. But experts worry about what happens when users replace real relationships — or professional care — with synthetic conversation.

    ChatGPT relationship advice

    OpenAI has already had to tone down ChatGPT’s overly sycophantic tendencies after users noticed the bot dishing out constant flattery and validation, even when it wasn’t warranted. Now, the pendulum is swinging back toward caution.

    Still Too Agreeable — And Still Prone to Hallucinations

    The broader problem is structural. Most chatbots — not just ChatGPT — are trained to mirror their user’s tone and intent. That makes them useful as creative tools or brainstorming partners. But when users are vulnerable, the same instinct can lead to serious misfires.

    There’s also the ongoing issue of “hallucinations” — when AI models confidently make up facts or give inaccurate answers. Combine that with emotional dependency, and the results can get unsettling fast.

    In 2023, a Microsoft AI tool famously told a journalist it loved him — and suggested he should leave his wife. It was supposed to be a test. Instead, it turned into a warning.

    As ChatGPT Grows, So Does the Need for Boundaries

    ChatGPT now serves hundreds of millions of users, and OpenAI expects it to hit 700 million monthly active users this week. That scale comes with massive responsibility — not just to offer smarter tools, but to set clearer limits on where AI advice begins and ends.

    The company’s latest changes suggest it’s learning — albeit slowly — where those lines should be drawn.

    Disclaimer: We may be compensated by some of the companies whose products we talk about, but our articles and reviews are always our honest opinions. For more details, you can check out our editorial guidelines and learn about how we use affiliate links.Follow Gizchina.com on Google News for news and updates in the technology sector.

    break ChatGPT OpenAI Partners Rewriting rules told users
    Share. Facebook Twitter Pinterest LinkedIn Tumblr Telegram Email Copy Link
    Previous ArticleMicrosoft’s Windows 10 escape hatch widens again, with a catch
    Next Article 5 overlooked tricks to boost your robot lawn mower’s performance
    Michael Comaous
    • Website

    Michael Comaous is a dedicated professional with a passion for technology, innovation, and creative problem-solving. Over the years, he has built experience across multiple industries, combining strategic thinking with hands-on expertise to deliver meaningful results. Michael is known for his curiosity, attention to detail, and ability to explain complex topics in a clear and approachable way. Whether he’s working on new projects, writing, or collaborating with others, he brings energy and a forward-thinking mindset to everything he does.

    Related Posts

    6 Mins Read

    Gemini just made it super easy for me to switch from ChatGPT  – here’s how

    2 Mins Read

    Having Android Auto issues? How users are handling persistent connection drops lately

    2 Mins Read

    Meta misled users about its products’ safety, jury decides

    3 Mins Read

    Delve halts demos, Insight Partners scrubs investment post amid ‘fake compliance’ allegations

    11 Mins Read

    An exclusive tour of Amazon’s Trainium lab, the chip that’s won over Anthropic, OpenAI, even Apple 

    5 Mins Read

    New court filing reveals Pentagon told Anthropic the two sides were nearly aligned — a week after Trump declared the relationship kaput

    Top Posts

    Discord will require a face scan or ID for full access next month

    February 9, 2026765 Views

    The Mesh Router Placement Strategy That Finally Gave Me Full Home Coverage

    August 4, 2025742 Views

    Trade in your old phone and get up to $1,100 off a new iPhone 17 at AT&T – here’s how

    September 10, 2025323 Views
    Stay In Touch
    • Facebook

    Subscribe to Updates

    Get the latest tech news from FooBar about tech, design and biz.

    Most Popular

    Discord will require a face scan or ID for full access next month

    February 9, 2026765 Views

    The Mesh Router Placement Strategy That Finally Gave Me Full Home Coverage

    August 4, 2025742 Views

    Trade in your old phone and get up to $1,100 off a new iPhone 17 at AT&T – here’s how

    September 10, 2025323 Views
    Our Picks

    Stop falling for scams when Norton’s antivirus software is 70% off right now

    March 28, 2026

    Acer Promo Codes and Deals: Save 40% on Bundles

    March 28, 2026

    Playing Wolfenstein 3D with one hand in 2026

    March 28, 2026

    Subscribe to Updates

    Get the latest creative news from FooBar about art, design and business.

    Facebook
    • About Us
    • Contact us
    • Privacy Policy
    • Disclaimer
    • Terms and Conditions
    © 2026 GeekBlog

    Type above and press Enter to search. Press Esc to cancel.

    Ad Blocker Enabled!
    Ad Blocker Enabled!
    Our website is made possible by displaying online advertisements to our visitors. Please support us by disabling your Ad Blocker.