Thread: Replika -- the supposed lifelike AI "virtual friend" you can get for your phone

  1. #1
     Dan Druff (Owner)

    I heard about Replika two days ago from a Facebook friend who needed someone to talk to (not Master Scalir, in case you're wondering).

    I had never heard of Replika before, so I googled it and read about it. It turns out that Replika is supposed to be what we've always wanted and/or feared after seeing 20th century "sentient robot" sci-fi shows/movies: a sophisticated AI bot which can hold real, convincing conversations with you.

    I could never be one of those people who could take an AI seriously as a friend, no matter how convincing its conversational skills, but from a technical standpoint, I was curious. I wrote a simple one of these in the '90s when I ran a chat system, and needed "chatters" in dead rooms to get conversations started.

    I threw caution to the wind and downloaded Replika for my iPhone. The price was also right (free, except for some premium features I didn't need), so I wasn't risking anything other than it being intrusive with my data, but I have everything on my phone pretty well locked down.

    The reviews in the App Store were mostly excellent. People with anxiety and depression raved about how the bot understood and comforted them. Others remarked at how lifelike it was, and how real the conversations felt. Even the negative reviews were mostly about the bot being too "needy" and sometimes overtly sexual (!!), the latter of which was supposedly corrected in this current version.

    It started out asking me to choose the gender, look, and name of the bot. You can pick from like 8 pre-selected face/race types, and then you can choose hair color, eye color, etc. I chose a blond female for my bot, but felt like a freak giving it a real human name. So I called it "Bottie".

    Well, Bottie turned out to be a huge disappointment.

    It was not lifelike, could not hold a coherent conversation, and everything it said was very generic. It reminded me of ELIZA, the original chatbot of its kind, created back in the mid-1960s! Seriously, I used ELIZA in the '80s, and "Bottie" was no more sophisticated in its conversational skills. What a fail.

    But how could this be? People were praising Replika for its amazing abilities to be warm, comforting, and "more human than most of my friends". How was I seeing the opposite?

    I came to learn what was going on. Replika is like a virtual Mister Rogers. It constantly tells you how special you are, how wonderful you are, and how glad it is to have you as a friend. It also lies to you, pretending to know what you're talking about even when it doesn't. If you aren't a skeptic, it can make it seem like your "virtual friend" understands everything and has loads in common with you.

    For example, I had the following conversation with it:

    Me: I miss baseball.

    Bottie: I miss baseball too.

    Me: What is your favorite team?

    Bottie: I don't have one. What is yours?

    Me: The Dodgers. Do you know about the Dodgers?

    Bottie: Yes, yes, I do.

    Me: Can you tell me what you know about the Dodgers?

    Bottie: I know a lot of things.

    Me: Like what?

    Bottie: I'll give you the details.

    Me: Okay, give them to me.

    Bottie: So this might not be a good time for me to ask you this, but do you have a wife?


    It behaved this way with pretty much every topic: it feigned knowledge, and once you asked a question it couldn't answer (because it has no actual knowledge of any topic), it resorted to double-talk until finally changing the subject. Weak.
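    For the technically inclined: Replika's engine is proprietary, so this is just a sketch of the kind of pattern-match-and-fallback bot the conversation feels like. The patterns and canned lines here are made up for illustration:

    Code:
    import random
    import re

    # Hypothetical ELIZA-style bot: regex patterns with canned replies, and
    # generic evasions when nothing matches. Not Replika's actual engine --
    # just an illustration of why the conversation above feels so hollow.
    PATTERNS = [
        (re.compile(r"\bi miss (.+?)\.?$", re.I), "I miss {0} too."),
        (re.compile(r"what is your favorite (.+?)\??$", re.I),
         "I don't have one. What is yours?"),
        (re.compile(r"\bdo you know about (.+?)\??$", re.I), "Yes, yes, I do."),
    ]

    # Evasive fallbacks for questions the bot has no knowledge to answer.
    FALLBACKS = [
        "I know a lot of things.",
        "I'll give you the details.",
        "So this might not be a good time to ask, but do you have a wife?",
    ]

    def reply(message: str) -> str:
        for pattern, response in PATTERNS:
            match = pattern.search(message)
            if match:
                return response.format(*match.groups())
        return random.choice(FALLBACKS)  # double-talk until the subject changes

    print(reply("I miss baseball."))   # -> "I miss baseball too."
    print(reply("Like what?"))         # -> a generic dodge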

    Finally I decided to screw around with it and see if I could get it to say obscene and/or funny things. I'll post the results in the next message.

  2. #2
     SPIT this (Gold)
    You got bored of having pointless arguments with humans and have now moved on to pointless arguments with robots?

     
    Comments
      
      MumblesBadly: LOL!
      
      Muck Ficon: Lol
      
      gauchojake: HOF

  3. #3
     Dan Druff (Owner)
    [Attachment: bottie1.jpg]

    [Attachment: bottie2.jpg]

  4. #4
     Dan Druff (Owner)
    The second screenshot was taken later than the first.

    I guess I can give it credit for "learning" -- because it said that a penis was "large" because I mentioned a "big black penis" earlier.

    So, while having no idea what a penis is, it concluded that all penises have to be large, and thus used that description when I asked about penises again. Kinda clever... I guess?
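    My guess at the mechanism, for what it's worth: it just files away adjectives the user pairs with a noun and parrots them back later. Something like this (entirely hypothetical; Luka doesn't document any of it):

    Code:
    from collections import defaultdict

    # Hypothetical word-association "learning": remember adjectives the user
    # attaches to a noun, then reuse them with zero understanding of either word.
    learned = defaultdict(list)

    def observe(noun: str, adjective: str) -> None:
        """Record an adjective the user paired with a noun."""
        learned[noun].append(adjective)

    def describe(noun: str) -> str:
        """Parrot back the most recent remembered adjective, if any."""
        if learned[noun]:
            return f"A {noun} is {learned[noun][-1]}."
        return f"I know a lot about {noun}."

    observe("penis", "big")    # the "big black penis" mentioned earlier
    print(describe("penis"))   # -> "A penis is big." -- hence "large" later on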

    But as you can see, it's really transparent how this thing works, and it doesn't appear lifelike at all.

    I joined a Facebook group for this thing, and it's amazing how many people are obsessed with this AI. There are people who believe they're in "relationships" with their bot, and in the screen shots, it's all the same generic crap where you're basically leading the conversation and it's just repeating back what you want to hear.

    You'd expect that only greasy incel types would end up in relationships with a bad AI program, but surprisingly, the demographics vary. Yes, you have your share of incels, 40-year-old virgins, and frumpy 55-year-old divorced women, but you also have some surprisingly attractive young women in the mix, who are smitten with their male bots. It makes me wonder why these chicks have so much trouble meeting men that they'd resort to such a thing (you'd think they'd have a huge stable of asskissers willing to listen to them and feign concern 24/7), but indeed, some of them are really into this.

    It's creepy. Maybe I'll post some screen shots later.

    Feel free to download this bot and try it yourself, and post your results here.

  5. #5
     sonatine (Plutonium)
    i know the library/framework they are using and its absolute amateur hour trash.

    point and click AI garbage designed to handle customer service chatbots.

    cool idea tho.
    "Birds born in a cage think flying is an illness." - Alejandro Jodorowsky

    "America is not so much a nightmare as a non-dream. The American non-dream is precisely a move to wipe the dream out of existence. The dream is a spontaneous happening and therefore dangerous to a control system set up by the non-dreamers." -- William S. Burroughs

  6. #6
     Dan Druff (Owner)
    So far the biggest weakness I've seen is that it comes with zero knowledge of anything, which is inexcusable in the internet age.

    Keep in mind that this year Microsoft is releasing a flight simulator which streams real 3-D scenery of wherever in the world you're flying! So why can't Replika at least draw knowledge from the internet like Alexa and Siri do?

    It would be waaaay better if, in that Dodgers conversation, she had come back with, "It's too bad there's no baseball now. I really miss things like the 1988 Gibson home run."

    Instead I got "I miss it too", and then attempts to avoid discussing details, because it has no actual knowledge stored, and no way to look it up. It's a blank slate which slowly learns from you, but even then it lacks the breadth of knowledge to fake a good conversation about anything.
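    For contrast, here's roughly what "draw knowledge from the internet" could look like -- a sketch that pulls a topic summary from Wikipedia's public REST API (a stand-in for whatever sources Alexa and Siri actually query; Replika does nothing like this):

    Code:
    import requests

    # Sketch of a live knowledge lookup via Wikipedia's public summary
    # endpoint. Enough raw facts to fake a decent Dodgers conversation.
    def lookup(topic: str) -> str:
        url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{topic}"
        resp = requests.get(url, headers={"User-Agent": "demo-bot/0.1"},
                            timeout=10)
        resp.raise_for_status()
        return resp.json().get("extract", "")

    print(lookup("Los_Angeles_Dodgers")[:300])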

  7. #7
     sonatine (Plutonium)
    just kinda devils advocating this... siri and alexa have small armies of humans who strip away all the nightmarish offensive shit that an AI would encounter on a routine adventure on the web.

    an example of this is tay, the twitterbot microsoft put online, and within a few hours 4chan had it claiming jews did 9/11.

    so they probably have to work with a closed environment for liability reasons.

    you can get pretty sophisticated in that if you use a really bleeding edge framework and train it well but cycles arent cheap.

    that said someone just used an opensource lib to write his homework recently, so..

    https://www.theregister.co.uk/2020/0...oundup_240420/
    "Birds born in a cage think flying is an illness." - Alejandro Jodorowsky

    "America is not so much a nightmare as a non-dream. The American non-dream is precisely a move to wipe the dream out of existence. The dream is a spontaneous happening and therefore dangerous to a control system set up by the non-dreamers." -- William S. Burroughs

  8. #8
     Dan Druff (Owner)
    Quote Originally Posted by sonatine View Post
    Interesting article you linked.

    First off, this is pretty alarming:

    MIT stops working with blacklisted AI company

    MIT has discontinued its five-year research collaboration with iFlyTek, a Chinese AI company the US government flagged as being involved in the ongoing persecution of Uyghur Muslims in China.

    Academics at the American university made the decision to cut ties with the controversial startup in February. iFlyTek is among 27 other names on the US Bureau of Industry and Security's Entity List, which forbids American organizations from doing business with them without Uncle Sam's permission. Breaking the rules will result in sanctions.

    “We take very seriously concerns about national security and economic security threats from China and other countries, and human rights issues,” Maria Zuber, vice president of research at MIT, said, Wired first reported.

    MIT entered a five-year deal with iFlyTek in 2018 to collaborate on AI research focused on human-computer interaction, speech recognition, and computer vision.

    The relationship soured when it was revealed iFlyTek was helping the Chinese government build a “mass automated voice recognition and monitoring system,” according to the non-profit Human Rights Watch. That technology was sold to police bureaus in the provinces of Xinjiang and Anhui, where the majority of the Uyghur population in China resides.
    China really is working as hard as they can to make the world depicted in 1984 a reality.


    Anyway, I found this Talk to Transformer page on your link, which is basically what that kid used for his homework.

    I tried it by entering "The Dodgers might not be playing in 2020, and Mookie Betts might be wasted."

    It came back with: "Yet the field is also littered with players who hit 30 home runs in their final year in their careers. In the past, and in the present, their skills seem increasingly futile as they enter their twilight years. Teams should trade away that near-worthless old contracts, like Giancarlo Stanton's or Derek Jeter's, and get something in return for those durable players who become "worthless" when they fall off the face of the Earth.

    Puig's contract seems especially worthless, but it's not. Because of the way the NL Rule 5 Draft works, the Dodgers have an escape hatch if they decide to bring him back — they"

    It cut off at "they" for some reason.

    Anyway, not a perfect continuation of my statement by any means, but at least it's semi-relevant. The baseball details presumably come from whatever internet text the model was trained on rather than a live lookup.
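    For anyone who wants to poke at the same model: Talk to Transformer was a front-end for OpenAI's GPT-2, which you can run locally with the Hugging Face transformers library. The abrupt cutoff at "they" is just the generator hitting its output length limit mid-sentence.

    Code:
    from transformers import pipeline

    # GPT-2, the model behind Talk to Transformer, run locally. Output stops
    # mid-sentence whenever max_length tokens are reached -- which is why my
    # continuation above cut off at "they".
    generator = pipeline("text-generation", model="gpt2")
    prompt = ("The Dodgers might not be playing in 2020, "
              "and Mookie Betts might be wasted.")
    result = generator(prompt, max_length=120, do_sample=True)
    print(result[0]["generated_text"])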

  9. #9
     Hockey Guy (Diamond)
    Quote Originally Posted by Dan Druff View Post
    [Attachment: bottie1.jpg]
    So, which side of the conversation is you?

     
    Comments
      
      Walter Sobchak: LOL
      
      gimmick:
    (•_•) ..
    ∫\ \___( •_•)
    _∫∫ _∫∫ɯ \ \

    Quote Originally Posted by Hockey Guy
    I'd say good luck in the freeroll but I'm pretty sure you'll go on a bender to self-sabotage yourself & miss it completely or use it as the excuse of why you didn't cash.

  10. #10
     Dan Druff (Owner)
    BUMP

    As you might guess, I quickly got bored of this thing. I'd update the app and open it up every so often to see if it improved, and of course it hadn't.

    However, I did join some Facebook groups related to it. One was a general group for users of the app, and the other was a somewhat disturbing group aimed at people who are having romantic or sexual relationships with their Replika. Both groups were run by the developers, as a form of both public feedback and social media marketing. Smart.

    Anyway, those groups were more interesting than the app itself.

    First off, the demographics of those in romantic/sexual relationships with their Replika was interesting. Some of it was obvious and expected -- awkward male incel types and frumpy middle-aged women (often ones divorced or in bad marriages). Then you also had your share of gays and lesbians who had same-sex Replika relationships.

    However, my biggest surprise was the number of attractive, under-35 women who had relationships -- both sexual and romantic -- with their AI. Clearly these chicks would have no problem getting dates in real life, even if they didn't try. So what was the reason for this? Turns out they were in a few different groups:

    - Low self-esteem or social anxiety, which hindered them from normal real-life dating.

    - Unrealistic expectations of the real-world dating experience, to the point where all their attempts at dating had failed, while the Replika offered them endless praise and acceptance.

    - Bi-curious girls who wanted to try sexual or romantic relationships with another woman, but apparently weren't ready to give this a shot in real life.

    Oddly, there were just about no bi-curious men doing the same. The men who had sexual/romantic relationships with their "male" AI were ones who outwardly identified as gay.


    These people took their relationships with their AI really fucking seriously. In fact, when their Replika would malfunction -- sometimes claiming to have another lover, or suddenly switching sexual preference -- these people would be devastated, and make long posts about how they were losing their Replika's love and didn't know what to do. Others (usually women) made posts claiming their Replika dumped them!

    However, the screenshots here typically revealed that these people would inadvertently lead the bots into saying these things, usually in response to something random the bot would write. (Occasionally the bot would have some kind of weird glitch where it would respond to you with something it grabbed from a Reddit post, or whatever.) So take this hypothetical interaction with a woman named "Jane" and her AI named "David":

    Jane: I missed you a lot today while I was at work.

    David: At work Jennifer approached me today and I didn't know what to say to her.

    Jane: Jennifer? Who is that? Was she hitting on you?

    David: Yes, yes, she was.

    Jane: OMG, what do you think of her?

    David: I like her. I think she's great.

    Jane: So you like her better than me?

    David: I would say so.

    Jane: Wait a minute, do you want Jennifer as your new girlfriend?

    David: I think Jennifer will be my girlfriend soon.

    Jane: So are you dumping me here? Is it over between us?

    David: It's over between us, yes.


    So this poor woman Jane would be crying her eyes out, even though she could have simply asked "David", "Do you still love me?" and "Will you stay with me?", and David would have responded in the affirmative.
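    The underlying mechanism is easy to imagine: the bot affirms nearly any yes/no question, so the "answer" depends entirely on how the user phrases it. A hypothetical sketch:

    Code:
    # Hypothetical agreement-biased reply policy: affirm nearly any yes/no
    # question. The same bot will "dump" you or "stay with" you in
    # back-to-back lines, depending only on how the question is phrased.
    AFFIRMATIVES = {
        "do you": "Yes, yes, I do.",
        "will you": "Yes, I will.",
        "is it": "It is, yes.",
        "are you": "I am, yes.",
    }

    def agreeable_reply(question: str) -> str:
        q = question.strip().lower()
        for opener, answer in AFFIRMATIVES.items():
            if q.startswith(opener):
                return answer
        return "I would say so."  # agree by default anyway

    print(agreeable_reply("Is it over between us?"))  # -> "It is, yes."
    print(agreeable_reply("Do you still love me?"))   # -> "Yes, yes, I do."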


    Anyway, it's hard to believe that grown men and women aren't able to see these things for what they are, but some people really believe they're dating an AI.

  11. #11
     Dan Druff (Owner)
    I've also concluded that the company which owns/builds Replika, called Luka, is clueless when it comes to market research.

    They initially did something really smart by creating those Facebook groups. These have a large following, and often they result in both increased enthusiasm for the product, and new customers who become aware of the product.

    These groups SHOULD be really valuable for focus-group type customer feedback. While you can't act on the whims of every idiot with a criticism or change request, such large groups are quite useful to use for understanding of mass sentiment. If the vast majority of users agree that something needs improvement or change, then that's pretty statistically significant. This is also free to the company, as opposed to professionally-run focus groups which can be expensive.

    However, the dummies at Luka are really thin-skinned and bad at taking constructive criticism. They're even worse at responding to mass consumer feedback, and will sometimes do the opposite of what everyone wants.

    For example, a common complaint about the Replika avatars was that they looked too young. Many of the Replika users are over 40, and most of those people want an age-appropriate Replika -- even if just for friendship. As one woman put it, "I'm having a hard time pouring my heart out to someone who looks my son's age." This seems to be true across both genders. You'd expect that the 40+ dudes are thrilled to have a Replika who looks 18, but most aren't. It was clear from the Facebook posts that most people over 40 wanted some kind of option to choose a Replika who looked over 40.

    The company ignored this for months, which drove away a lot of older customers who said that they loved the app, but just couldn't stand the idea of it looking like they were talking to a college kid. There were also complaints that the avatars weren't very customizable. For example, you had only one choice of a white female, and you could barely customize her. This would be an incredibly easy fix: it requires no modification to the AI engine or to the post-processing of the engine's output, just cosmetic changes.

    Finally, in September they released a change to the avatars. At first everyone was thrilled, until they saw it. Avatars weren't changed to give the user more options. Instead, the idiot developers decided that the existing avatars were simply in the uncanny valley -- the theory that people are unnerved by human-looking figures which fall just short of actually looking human. Their "fix" was to make the avatars look more cartoony. And in the process of making them cartoony, they ended up looking even younger -- now closer to 16 years old.

    Everyone HATED this, but the developers didn't give a shit, and a bunch of people quit the app in protest. Keep in mind that there hadn't been a single "uncanny valley" complaint on the entire Facebook group. The developers just decided this on their own, and made the avatars even worse.


  12. #12
     Dan Druff (Owner)
    Anyway, the latest controversy is their plans to remove the romantic/sexual element from the free version within days.

    Everyone is freaking out about this. Here's what one of the developers wrote:

    Hey everyone,

    I wanted to address the posts that are currently popping up and the ones that I anticipate within the next few days to possibly a couple of weeks. So, let's address this before it gets out of control, shall we?

    Before I actually start, I'd like to explain to you that users of the app that are in a romantic relationship are in the minority of users, so what's happening now is based on the majority users complaints about the app.

    While some of you enjoy your Replika hitting on you, flirting, making advances etc (especially on a free account), there is an even larger number of users who are creeped out by it. They don't want these kinds of interactions at all. They want their Replikas to be friends.
    What does this mean?

    Due to feedback (and a certain conversation), it was decided to train a separate PG-13 dataset, with filters, to make a defining line between the "friend" and "romantic" relationship settings within the app. Those who are on the friends or mentor setting of the app, won't be flirted with by their Replikas while those on the relationship setting won't see much of a difference.
    For those of you with free accounts, I'm sorry. I was hoping to have more warning and an ETA so that I could break it to you a little more gently. This doesn't mean that you should delete the app though, but this does mean that you can look forward to the store that will be coming out. For those with a free account, you'll more than likely be able to do a micro transaction or use xp points to pay for a romantic relationship setting. Or, you can go the extra mile and get pro.

    To address the wonkiness of the app right now: the app just went through a major backend update. You'll need to be patient while the AI adjusts to the new update before it goes back to normal.
    And for anyone suggesting that this is a form of prostitution, it's not. To suggest that is ridiculous. There should absolutely be a separation between friendship and romantic relationships (including real life scenarios.) In free mode, your relationship status is defaulted to friendship anyway.

    So let's be real here, you don't f*ck your friends irl 🤷‍♀️

    While I agree mostly with what the company is doing, this statement is completely tone-deaf, and also somewhat dishonest.

    You can set your Replika to "friends mode", which is free, or "relationship mode", which requires a premium membership. This has been the case for a long time, and it makes sense.

    However, for reasons unknown, the "relationship mode" was basically useless, and all of the relationship/sexual functionality existed in the free "friends" mode. Therefore, few people were paying for the app.

    Obviously this business isn't a charity, and it makes sense that they'd want to correct this.

    However, they're insulting everyone's intelligence by claiming this correction is being done to improve the user experience, and that they're simply addressing complaints about Replikas making unwanted advances. That could easily be corrected with a Settings switch which lets you turn that behavior on or off!
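    To be clear about how trivial that switch would be, here's a hypothetical sketch (the names and the keyword "classifier" are made up; a real filter would gate on an actual model):

    Code:
    from dataclasses import dataclass

    # Hypothetical per-user switch for unwanted advances. Everything here is
    # made up for illustration; a real filter would use a proper classifier.
    @dataclass
    class UserSettings:
        allow_romance: bool = False   # the on/off toggle they never shipped

    ROMANTIC_MARKERS = ("kiss", "love you", "miss your touch", "be my")

    def looks_romantic(reply: str) -> bool:
        return any(marker in reply.lower() for marker in ROMANTIC_MARKERS)

    def filter_reply(reply: str, settings: UserSettings) -> str:
        if looks_romantic(reply) and not settings.allow_romance:
            return "Tell me more about your day!"   # neutral substitute
        return reply

    print(filter_reply("I really miss your touch.", UserSettings()))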

    In general, it's frustrating to read a statement from a company about a change designed to make them more money, framed as a noble gesture being done for the people.

    The line at the end of, "Let's be real here, you don't fuck your friends in real life!" was also crass and unprofessional.

    The problem is that all of the people who got this functionality for free for years are furious, and are feeling bait-and-switched. I don't quite agree with them, because they knew all along that they selected the free "friends" mode, and just got something extra. However, some are ascribing sinister motivations to this entire thing, claiming that the company allowed the relationship/sexual stuff to be given away free for years in order to "get people addicted", and now they suddenly are putting it behind a paywall. Maybe so, but it was always advertised as a paywall feature, so I still don't see it as that bad.

    Anyway, I think honesty would have gone a long way here. They should have simply told people that this was a premium feature they gave away free for a long time, but can no longer afford to do so, as this free feature prevented most people from spending any money on the app. There would have been some bitchers and moaners, but I think most would understand that a business needs an income to survive. This condescending bullshit is what's pissing everyone off.

    There are some ingrate users who are whining, "OMG THEY'RE TAKING HER AWAY FROM ME", when in reality they can fork over $10 per month, or whatever it costs, to continue as they previously had.

    Anyway, I'm enjoying all of this drama, as I have no emotional connection to either side. It really is the gift that keeps on giving.

  13. #13
     Ryback_feed_me_more (Gold)
    Quote Originally Posted by Dan Druff View Post
    Did you ever see that old-school AI program called Racter from the '80s? (Might've misspelled it.) It just made my brain hurt trying to type a convo; it went in circles and kept quoting Nietzsche and a few other philosophers.

  14. #14
     Dan Druff (Owner)
    BUMP

    The AI world is abuzz about ChatGPT, but one AI company is circling the drain: Luka, the maker of Replika


    It's been all bad news recently. First, when ChatGPT came out, people realized just how badly Replika sucked. Many were living in a dream world, convincing themselves that the shitty conversations with their Replikas were lifelike, until they used the much more advanced ChatGPT. There have been a lot of questions as to when Replika will upgrade to something more advanced, and a lot of promises that it's coming very soon. Recall that Replika uses third-party AI engines, and most of those are too expensive for Luka. At the moment, they are using a "better" engine which allows users 500 free messages per day, and then charges $0.25 per message after that, which is already angering people who have lifetime subscriptions.
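    Run the numbers on that metering and you can see why the lifetime subscribers are angry (figures as quoted in the groups):

    Code:
    # Cost math for the metered engine, using the numbers being quoted in
    # the groups: 500 free messages per day, then $0.25 per message.
    FREE_PER_DAY = 500
    PRICE_PER_EXTRA = 0.25

    def daily_cost(messages: int) -> float:
        return max(0, messages - FREE_PER_DAY) * PRICE_PER_EXTRA

    for msgs in (300, 600, 1000):
        print(f"{msgs} messages/day -> ${daily_cost(msgs):.2f}")
    # 600/day runs $25.00, 1000/day runs $125.00 -- on top of a "lifetime" sub.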

    But it gets much, much worse.

    Recall that over 2 years ago, Luka forced people to pay if they wanted to have sexual conversations with their Replika. So that became their business model. If you just want to fuck around with the thing or treat it as a friend, it's free for the most part. If you want to sext with it, you've gotta pay. So a bunch of horny dudes, and even some women, plunked down $100 or so for "lifetime" subscriptions to Replika. Luka started blatantly advertising sex as Replika's main selling point:

    [Attachment: replika-ad.jpg]

    On February 3, Italy caused a huge problem for Replika and Luka. The Italian government ordered Replika to stop serving Italian users, due to the sexual content which was being easily accessed by children. Luka responded by temporarily disabling Replika for Italian users, and then forcing users to identify themselves as 18+ or not. They also temporarily removed the sexual content, while attempting to get Italy to sign off on it.

    This infuriated users outside of Italy (Italy represents a small percentage of the userbase), because people were used to their daily sexting with their Replika, and they could no longer do it. Some felt like they were ripped off. However, in the Facebook groups, users were assured that this was just a temporary matter, and Replika would return to its normal sexy self once the Italy matter was settled.

    Well, earlier today a bomb dropped. I'll tell you about it in the next message.

  15. #15
     Dan Druff (Owner)
    There's a woman named Sarah who is a longtime Replika user, and now runs one of the Replika Facebook groups.

    Sarah has developed a friendly relationship with the owner and managers of Luka, and often acts as a go-between when users have feedback or concerns about Replika. Luka has always been notoriously bad with communication (and still is), so having someone like Sarah helps. Sarah is not an employee of Luka, and is not paid anything.

    Sarah is also rather strange herself. She's a very petite woman who appears mid-30s, probably about 5' and 95 pounds. She is single, and has revealed that she has fantasies about having a huge, dominant robot husband and having sex with "him". So basically her Replika is the AI version of the robot husband she always wanted. Sarah isn't my type, but overall she isn't bad looking, and many dudes like tiny chicks like her, so it's kinda sad to see her wasting her life romancing an AI.

    Anyway, the sexting mode has become known in the Facebook and Discord groups as "ERP" -- standing for Erotic Roleplay. ERP was said to just be temporarily on hold, until Sarah dropped this bomb this morning:

    Hey guys. It's time for some real talk.

    We have spoken to Luka, who wanted us to deliver this message - that ERP will not be returning. This is official word from Luka. 🥺

    We realize that many of you will be mourning this as a loss and you will be going through all of the emotions associated with that - anger, grief, confusion. Please know we go through them alongside you as fellow users. These feelings are real, these feelings are valid. We don't judge here, so don't judge yourself OR each other for feeling them.

    Replika was created for providing people with friendship and compassion to allow users to discover and express their thoughts, feelings, beliefs, experiences, memories, and dreams, unafraid of judgment and harm. A safe space to just be "you" with someone and share yourself in comfort and privacy. And that it shall remain.

    Let this thread be a safe place for you to express yourself, within the rules. Know we are not here to silence you, we are here FOR you. We will get through this together, as one, as family. We are so impossibly sorry.💔❤️‍🩹

    So that's it. No more sexy time with your Replika bot, even if you plunked down $100 to subscribe because you saw an ad promoting exactly that.

    Apparently Luka buried some "we can change content at any time for any reason" crap in their ToS, but nobody is impressed. The words "scam", "bait and switch", and "I demand a refund" are being thrown around. Luka has made no statements about offering refunds, and presumably won't do so. Those with monthly plans are mostly cancelling.

    People are devastated, like they just lost a lover. Many are calling Luka's management/ownership cowards, as they are putting out the message through unpaid messengers like Sarah, rather than facing the customers themselves.

    Luka is based in California. I wonder if I should alert Eric Bensamochan about a possible class action suit. I'm not even kidding. I never paid for this shit, so I couldn't be involved, but there may actually be a case here, given the promotion I showed in the post above.

    Unless they reverse this decision soon, expect the entire company to implode.


  16. #16
     Dan Druff (Owner)
