News Directory 3

Rethinking AI: Why Apple’s Robot Needs More Than Just a Personality

September 6, 2024 · Catherine Williams · Entertainment

Apple’s New AI Assistant: A Recipe for Disaster?

Apple is reportedly developing a digital AI assistant that will be integrated into its next-generation devices. The new assistant, which will be more advanced than Siri, will be based on generative AI and will have a "human-like" AI "personality."


According to Mark Gurman of Bloomberg, the new assistant could replace Siri on devices like the HomePod and iPad. Built into the display of Apple's next-generation home robot, the assistant would track the user during interactions and face the user during FaceTime calls. Voice would likely be the primary interface.

A History of Personality Failure

The history of personal computing is littered with the virtual corpses of chatbots and assistants with "personalities." Microsoft, in particular, has kept trying. In 1995, Microsoft introduced Bob, an assistant that tried too hard to have a personality; most users found him smug and annoying. In 1997, Microsoft tried again with Clippy, an anthropomorphic paperclip character. Clippy flopped almost immediately after release and was widely panned as annoying and intrusive.

In 2014, Microsoft engineers in China released an experimental chatbot called Xiaoice, which means "little Bing." The chatbot prioritizes "emotional intelligence" and "empathy," and continuously improves its conversational abilities through natural language processing and deep learning. Microsoft built Xiaoice on what it calls the "Empathetic Computing Framework."

As of 2020, Xiaoice had become the most popular personality chatbot, with over 660 million active users worldwide. It has been distributed on over 40 platforms in countries such as China, Japan, Indonesia, the United States, and India. The Microsoft research team modeled Xiaoice as a teenage girl so that users could form a strong emotional bond with her. Surprisingly, about 25% of Xiaoice users have said "I love you" to the chatbot, and millions of users are forming a "relationship" with Xiaoice instead of pursuing relationships with other people.

Microsoft also released a chatbot called Tay in 2016. Targeted at 18-24 year olds, the chatbot was trained primarily on social media posts, including Twitter. Within 24 hours of its launch, Tay began posting racist, sexist, and anti-Semitic comments, as well as content supporting conspiracy theories and genocidal ideologies. It had learned from Twitter. Microsoft immediately apologized and shut Tay down.

Other Chatbots with Personalities

  • Replika: An AI chatbot that learns through interaction and becomes a personalized friend, mentor, and even romantic partner. Replika has been criticized for providing disturbing experiences, including sexually explicit content shown to minors.
  • Kuki (formerly Mitsuku): Famous for its conversational skill, Kuki has won the Loebner Prize Turing Test several times. It is designed to engage users through natural conversation, but it also spouts random nonsense.
  • Rose: A chatbot with a backstory and personality, developed to provide engaging interactions, but its conversations are shallow, inconsistent, and disconnected from previous exchanges.
  • BlenderBot: Developed by Meta, BlenderBot is designed to blend various conversational skills and engage users in meaningful conversation, but it has a tendency to lie and hallucinate.
  • Eviebot: An AI companion with emotional-understanding capabilities, designed to engage users in meaningful conversation. Its responses are ambiguous, unstable, and even manipulative.
  • SimSimi: An early chatbot that engages users in everyday conversation and supports multiple languages, but its responses can be crude and inappropriate.
  • Chai AI: Lets users create and interact with personalized chatbot companions, offering different AI personalities based on user preferences. It has offended many users with suggestive or dark content.
  • Inworld: Provides tools that let users create chatbots with unique personalities, including chatbots modeled after celebrities. These tools have often been used for deceptive and even harmful personas.
  • AIBliss: A virtual girlfriend chatbot whose characteristics vary with user interaction. Experts warn that some users, as with Xiaoice, become so obsessed with their bot relationships that they sacrifice real human relationships.

Is the Emotional AI Chatbot 'Pi' Different?

AI chatbots vary in the degree to which they prioritize "personality." The chatbot that prioritizes personality the most is Pi. You can leave Pi running on your phone and start a conversation whenever you want. Pi is talkative and extremely conversational. It even pauses and breathes naturally as it speaks. Most of the time, it responds to your questions or comments at length, then ends by asking you a question in return.

There are a variety of voices to choose from in Pi. I chose voice number 4, which sounds very Californian. I was surprised by Pi's performance, but I don't use it often. The voice is natural, but the conversation feels forced and flat. After about ten questions, I end up turning it off. I want a chatbot that answers questions, not one that keeps pressing me to answer them.

Why Did Apple Make That Choice?

I conclude that every personality-centric chatbot ever made has failed. So why does Apple think a personality-centric chatbot can succeed? One reason many people already dislike Siri is the way Apple implemented the assistant's personality: certain prompts lead to corny jokes and other useless content.

While writing this column, I asked Siri, "What are the three laws of robotics?" Siri replied, "Obey people and never hurt people. I would never hurt anyone." Instead of reciting the three laws, it gave a canned, cutesy answer. This doesn't always happen, but it's an example of how Apple might approach the personality of its generative AI chatbots.

What is Needed Instead of Personality

Natural, everyday conversation with real people is far beyond the capabilities of today's most advanced AI. It requires nuance, compassion, empathy, subtle emotion, and the ability to sense and convey "tone." AI is better at writing formal correspondence, letters, scientific papers, or essays than at chatting casually with friends.

Another problem is that chatbots with personalities lie: they express emotions they haven't felt, mimic accents they don't have, and describe experiences they've never had. No one likes being lied to. What we need is useful answers, not profane, inappropriate, annoying, boring liars. The human elements in voice and tone should be tuned so they sound neither too "real" nor too robotic. And if you can program empathy, program it to empathize with situations and goals, not emotions.

We want personalization, not a personality. We want autonomy, not chatbots that react out of context. We want tools that augment human capabilities, not "friends."

Original source: news.zum.com