• Character.AI won't let its chatbots get romantic with teenagers anymore

    From TechnologyDaily@1337:1/100 to All on Saturday, December 14, 2024 00:30:05
    Character.AI won't let its chatbots get romantic with teenagers anymore

    Date:
    Sat, 14 Dec 2024 00:00:00 +0000

    Description:
    Character.AI adds new safety features for AI chatbots.

    FULL STORY ======================================================================

    Character.AI has a new set of features aimed at making interactions with the virtual personalities it hosts safer, especially for teenagers. The company just debuted a new version of its AI model specifically designed for its younger users, as well as a set of parental controls to manage their time on the website. The updates follow earlier safety changes to the platform in the wake of accusations that the AI chatbots were negatively impacting the mental health of children.

    These safety changes have been accompanied by other efforts to tighten the reins on Character.AI's content. The company recently began a purge, albeit an incomplete one, of AI imitations of copyrighted and trademarked characters.

    For teen users, the most noticeable change will likely be the division
    between the adult and teen versions of the AI model. You have to be 13 to
    sign up for Character.AI, but users under 18 will be directed to a model with narrower guardrails specifically built to prevent romantic or suggestive interactions.

    The model also has better filters for what the user writes and is better at noticing when a user attempts to bypass those limits. That includes a new restriction on editing the chatbot's responses to sneak around the suggestive-content filter. The company is keen on keeping any conversations between teenagers and its AI personalities PG. In addition, if a conversation touches on topics like self-harm or suicide, the platform will pop up a link to the National Suicide Prevention Lifeline to help guide teens to professional resources.

    Character.AI is also working to keep parents in the loop about what their teenagers are doing on the website, with controls set to come out early next year. The new parental controls will give parents insight into how much time their kids spend on the platform and which bots they're chatting with the most. To make sure these changes hit the right notes, Character.AI is working with several teen online safety experts.

    Teenagers aren't the only users Character.AI is trying to help maintain a sense of reality. The company is also tackling concerns about screen-time addiction: all users will now get a reminder nudging them to take a break after they've been talking to a chatbot for an hour.

    The existing disclaimers about the AI origins of the characters are also getting a boost. Instead of just a small note, you'll see a longer explanation about them being AI. That's especially true if any of the chatbots are described as doctors, therapists, or other experts. A new extra warning makes it crystal clear that the AI isn't a licensed professional and shouldn't replace real advice, diagnosis, or treatment. Imagine a big yellow sign saying, "Hey, this is fun and all, but maybe don't ask me for life-changing advice."

    "At Character.AI, we are committed to fostering a safe environment for all
    our users. To meet that commitment we recognize that our approach to safety must evolve alongside the technology that drives our product creating a platform where creativity and exploration can thrive without compromising safety," Character.AI explained in a post about the changes. "To get this right, safety must be infused in all we do here at Character.AI. This suite
    of changes is part of our long-term commitment to continuously improve our policies and our product." You might also like... Character.AI institutes new safety measures for AI chatbot conversations Why your favorite fictional AI friends are vanishing from Character.AI Character.ai lets you talk to your favorite (synthetic) people on the phone which isn't weird at all



    ======================================================================
    Link to news story: https://www.techradar.com/computing/artificial-intelligence/character-ai-wont-let-its-chatbots-get-romantic-with-teenagers-anymore


    --- Mystic BBS v1.12 A47 (Linux/64)
    * Origin: tqwNet Technology News (1337:1/100)