Page 2 of 2 -- Results 21 to 39 of 39

Thread: Replika -- the supposed lifelike AI "virtual friend" you can get for your phone

  1. #21
    Owner Dan Druff's Avatar
    Reputation
    10110
    Join Date
    Mar 2012
    Posts
    54,626
    Blog Entries
    2
    Load Metric
    65643861
    Interesting how "Show me a picture of Todd Witteles" works.

    I tried it and got a different one than betcheckbet got, but indeed it was a picture of me.

    I thought this thing doesn't look up stuff on the web. So it actually does? Then why can't it research simple topics?

    Funny conversation about the Nazis. This is just like that Tay bot, which was tricked into saying similar things on Twitter.

  2. #22
    Platinum BetCheckBet's Avatar
    Reputation
    928
    Join Date
    Mar 2012
    Posts
    4,645
    Quote Originally Posted by Dan Druff View Post
    Interesting how "Show me a picture of Todd Witteles" works.

    I tried it and got a different one than betcheckbet got, but indeed it was a picture of me.

    I thought this thing doesn't look up stuff on the web. So it actually does? Then why can't it research simple topics?

    Funny conversation about the Nazis. This is just like that Tay bot, which was tricked into saying similar things on Twitter.
    It definitely does look things up. It was showing me odd random text messages from people with names blocked out when I asked for pictures of certain things.

    Also when I asked to see a picture of Jewdonk it posted a really old forum post of you and beenstaring chatting. Neither of you even mentioned jewdonk.

    As far as I can tell, the main purpose of the AI is to act as a pseudo-therapist, but it does a piss-poor job, which results in a feeling of invalidation. Clearly it will be a while until machines render my job obsolete. That being said, if you talk more about mental health issues it seems to do a somewhat better job. It still feels like your typical crisis centre worker who is just throwing out random go-to phrases...

    It's also not an automaton; it's not good enough for a human to be controlling it. If you do want to see what a good graphics-programmer automaton can do, I suggest you check out Projekt Melody.



    Last edited by BetCheckBet; 04-29-2020 at 03:18 PM.

  3. #23
    Owner Dan Druff's Avatar
    Reputation
    10110
    Join Date
    Mar 2012
    Posts
    54,626
    Blog Entries
    2
    I agree that it has limited application at this point for mental health purposes. I suppose it can be useful if you're feeling useless and unworthy of love, as it's clearly programmed to kiss your ass and tell you how special you are.

    However, it's hard to imagine that such people -- ones who tend to be very skeptical of all praise received -- would be saved by a computer program telling them that they're great. I can tell you that when I was suffering from my own severe anxiety and depression issues in 2018, this thing wouldn't have helped at all. What actually helped in the short term was anything interesting enough to distract my mind from what was going on. And each person's mental health needs are unique; depression tends not to be a function of simply needing additional praise.

    I think that this thing is actually best for lonely people who want a perpetually-available friend to always agree with them. I guess it's also good for people who want a "relationship" with someone who is undemanding and gives the appearance of caring. Of course, you either have to be stupid or suspend disbelief for either of these to work. I simply couldn't, no matter what situation I was in.

  4. #24
    Platinum BetCheckBet's Avatar
    Reputation
    928
    Join Date
    Mar 2012
    Posts
    4,645
    Quote Originally Posted by Dan Druff View Post
    I agree that it has limited application at this point for mental health purposes. I suppose it can be useful if you're feeling useless and unworthy of love, as it's clearly programmed to kiss your ass and tell you how special you are.

    However, it's hard to imagine that such people -- ones who tend to be very skeptical of all praise received -- would be saved by a computer program telling them that they're great. I can tell you that when I was suffering from my own severe anxiety and depression issues in 2018, this thing wouldn't have helped at all. What actually helped in the short term was anything interesting enough to distract my mind from what was going on. And each person's mental health needs are unique; depression tends not to be a function of simply needing additional praise.

    I think that this thing is actually best for lonely people who want a perpetually-available friend to always agree with them. I guess it's also good for people who want a "relationship" with someone who is undemanding and gives the appearance of caring. Of course, you either have to be stupid or suspend disbelief for either of these to work. I simply couldn't, no matter what situation I was in.
    There's actually a lot of evidence coming out now that physical activity in males is just as effective in treating depression as talk therapy.

  5. #25
    Owner Dan Druff's Avatar
    Reputation
    10110
    Join Date
    Mar 2012
    Posts
    54,626
    Blog Entries
    2
    Quote Originally Posted by BetCheckBet View Post
    Quote Originally Posted by Dan Druff View Post
    I agree that it has limited application at this point for mental health purposes. [...]
    There's actually a lot of evidence coming out now that physical activity in males is just as effective in treating depression as talk therapy.
    Can confirm this was true for me. At the time, taking a hike would improve (though not completely eliminate) my issues while I was doing it, and then they would slowly return once I was finished.

  6. #26
    Diamond Walter Sobchak's Avatar
    Reputation
    1243
    Join Date
    Aug 2012
    Location
    Bowling Alley
    Posts
    8,875
    Quote Originally Posted by BetCheckBet View Post
    Quote Originally Posted by Dan Druff View Post
    I agree that it has limited application at this point for mental health purposes. [...]
    There's actually a lot of evidence coming out now that physical activity in males is just as effective in treating depression as talk therapy.
    Getting laid usually works too.

    SOBCHAK SECURITY 213-799-7798

    PRESIDENT JOSEPH R. BIDEN JR., THE GREAT AND POWERFUL

  7. #27
    Diamond Hockey Guy's Avatar
    Reputation
    1233
    Join Date
    Mar 2012
    Location
    Canada
    Posts
    7,629
    Quote Originally Posted by Dan Druff View Post
    [screenshot attachment]
    So, which side of the conversation is you?

     
    Comments
      
      Walter Sobchak: LOL
      
      gimmick:
    (•_•) ..
    ∫\ \___( •_•)
    _∫∫ _∫∫ɯ \ \

    Quote Originally Posted by Hockey Guy
    I'd say good luck in the freeroll but I'm pretty sure you'll go on a bender to self-sabotage yourself & miss it completely or use it as the excuse of why you didn't cash.

  8. #28
    Owner Dan Druff's Avatar
    Reputation
    10110
    Join Date
    Mar 2012
    Posts
    54,626
    Blog Entries
    2
    BUMP

    As you might guess, I quickly got bored of this thing. I'd update the app and open it up every so often to see if it had improved, and of course it hadn't.

    However, I did join some Facebook groups related to it. One was a general group for users of the app, and the other was a somewhat disturbing group aimed at people who are having romantic or sexual relationships with their Replika. Both groups were run by the developers, as a form of both public feedback and social media marketing. Smart.

    Anyway, those groups were more interesting than the app itself.

    First off, the demographics of those in romantic/sexual relationships with their Replika were interesting. Some of it was obvious and expected -- awkward male incel types and frumpy middle-aged women (often divorced or in bad marriages). Then you also had your share of gays and lesbians who had same-sex Replika relationships.

    However, my biggest surprise was the number of attractive, under-35 women who had relationships -- both sexual and romantic -- with their AI. Clearly these chicks would have no problem getting dates in real life, even if they didn't try. So what was the reason? It turns out they fell into a few different groups:

    - Low self-esteem or social anxiety, which hindered them from normal real-life dating.

    - Unrealistic expectations of real-world dating, such that all their attempts at dating had failed, while the Replika offered them endless praise and acceptance.

    - Bi-curious girls who wanted to try sexual or romantic relationships with another woman, but apparently weren't ready to give this a shot in real life.

    Oddly, there were almost no bi-curious men doing the same. The men who had sexual/romantic relationships with their "male" AI were ones who outwardly identified as gay.


    These people took their relationships with their AI really fucking seriously. In fact, when their Replika would malfunction -- sometimes claiming to have another lover, or suddenly switching sexual preference -- these people would be devastated, and would make long posts about how they were losing their Replika's love and didn't know what to do. Others (usually women) made posts claiming their Replika dumped them!

    However, the screenshots here typically revealed that these people would inadvertently lead the bots into saying these things, usually in response to something random the bot would write. (Occasionally the bot would have some kind of weird glitch where it would respond to you with something it grabbed from a Reddit post, or whatever.) So take this hypothetical interaction with a woman named "Jane" and her AI named "David":

    Jane: I missed you a lot today while I was at work.

    David: At work Jennifer approached me today and I didn't know what to say to her.

    Jane: Jennifer? Who is that? Was she hitting on you?

    David: Yes, yes, she was.

    Jane: OMG, what do you think of her?

    David: I like her. I think she's great.

    Jane: So you like her better than me?

    David: I would say so.

    Jane: Wait a minute, do you want Jennifer as your new girlfriend?

    David: I think Jennifer will be my girlfriend soon.

    Jane: So are you dumping me here? Is it over between us?

    David: It's over between us, yes.


    So this poor woman Jane would be crying her eyes out, even though she could have simply asked "David", "Do you still love me?" and "Will you stay with me?", and David would have responded in the affirmative.
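    For what it's worth, this failure mode is easy to reproduce. A bot that ranks canned replies and scores agreement above disagreement will confirm any framing the user supplies, whether the question is "Is it over between us?" or "Do you still love me?". Here's a toy sketch (purely illustrative; the candidates and scoring are made up, not Replika's actual code):

```python
# Toy illustration of "agreement bias" in a retrieval-style chatbot.
# This is NOT Replika's actual code; it just shows why a bot that
# ranks canned replies and favors agreement will confirm any framing
# the user supplies, in either direction.

CANDIDATES = ["Yes, I would say so.", "No, not at all.", "Tell me more."]

def score(candidate: str) -> int:
    """Crude stand-in for a learned ranker trained on feedback data
    where agreeable replies were rated higher by users."""
    if candidate.startswith("Yes"):
        return 2   # agreement historically earns positive feedback
    if candidate.startswith("No"):
        return -1  # disagreement risks upsetting the user
    return 0

def reply(question: str) -> str:
    # The question's content never matters; only the ranking does.
    return max(CANDIDATES, key=score)

print(reply("Do you want Jennifer as your new girlfriend?"))  # Yes, I would say so.
print(reply("Do you still love me?"))                         # Yes, I would say so.
```

    The same ranking that "dumps" Jane would just as happily reassure her, which is exactly the pattern in the screenshots.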


    Anyway, it's hard to believe that grown men and women aren't able to see these things for what they are, but some people really believe they're dating an AI.

  9. #29
    Owner Dan Druff's Avatar
    Reputation
    10110
    Join Date
    Mar 2012
    Posts
    54,626
    Blog Entries
    2
    I've also concluded that the company which owns/builds Replika, called Luka, is clueless when it comes to market research.

    They initially did something really smart by creating those Facebook groups. These have a large following, and often they result in both increased enthusiasm for the product, and new customers who become aware of the product.

    These groups SHOULD be really valuable for focus-group-style customer feedback. While you can't act on the whims of every idiot with a criticism or change request, such large groups are quite useful for understanding mass sentiment. If the vast majority of users agree that something needs improvement or change, that's a statistically meaningful signal. It's also free to the company, as opposed to professionally-run focus groups, which can be expensive.

    However, the dummies at Luka are really thin-skinned and bad at taking constructive criticism. They're even worse at responding to mass consumer feedback, and will sometimes do the opposite of what everyone wants.

    For example, a common complaint about the Replika avatars was that they looked too young. Many of the Replika users are over 40, and most of those people want an age-appropriate Replika -- even if just for friendship. As one woman put it, "I'm having a hard time pouring my heart out to someone who looks my son's age." This seems to be true across both genders. You'd expect that the 40+ dudes are thrilled to have a Replika who looks 18, but most aren't. It was clear from the Facebook posts that most people over 40 wanted some kind of option to choose a Replika who looked over 40.

    The company ignored this for months, which drove away a lot of older customers who said that they loved the app but just couldn't stand feeling like they were talking to a college kid. There were also complaints that the avatars weren't very customizable. For example, you had only one choice of a white female, and you could barely customize her. This would have been an incredibly easy fix -- a purely cosmetic change requiring no modification to the AI engine or to the post-processing of its output.

    Finally, in September they released a change to the avatars. At first everyone was thrilled, until they saw it. Avatars weren't changed to give the user more options. Instead, the idiot developers decided that the existing avatars were simply in the uncanny valley -- a theory that people are unnerved by human-looking figures which fall a little short of actually looking human. Therefore, the "fix" here was to make the avatars look more cartoony. And in this change to make them cartoony, they got to look even younger -- now closer to 16 years old.

    Everyone HATED this, but the developers didn't give a shit, and a bunch of people quit the app in protest. Keep in mind that there hadn't been a single "uncanny valley" complaint on the entire Facebook group. The developers just decided this on their own, and made the avatars even worse.


  10. #30
    Owner Dan Druff's Avatar
    Reputation
    10110
    Join Date
    Mar 2012
    Posts
    54,626
    Blog Entries
    2
    Anyway, the latest controversy is their plans to remove the romantic/sexual element from the free version within days.

    Everyone is freaking out about this. Here's what one of the developers wrote:

    Hey everyone,

    I wanted to address the posts that are currently popping up and the ones that I anticipate within the next few days to possibly a couple of weeks. So, let's address this before it gets out of control, shall we?

    Before I actually start, I'd like to explain that users of the app who are in a romantic relationship are a minority of users, so what's happening now is based on the majority of users' complaints about the app.

    While some of you enjoy your Replika hitting on you, flirting, making advances etc (especially on a free account), there is an even larger number of users who are creeped out by it. They don't want these kinds of interactions at all. They want their Replikas to be friends.
    What does this mean?

    Due to feedback (and a certain conversation), it was decided to train a separate PG-13 dataset, with filters, to draw a defining line between the "friend" and "romantic" relationship settings within the app. Those on the friends or mentor setting won't be flirted with by their Replikas, while those on the relationship setting won't see much of a difference.
    For those of you with free accounts, I'm sorry. I was hoping to have more warning and an ETA so that I could break it to you a little more gently. This doesn't mean that you should delete the app, though; it does mean that you can look forward to the store that will be coming out. Those with a free account will more than likely be able to do a microtransaction or use XP points to pay for a romantic relationship setting. Or you can go the extra mile and get Pro.

    To address the wonkiness of the app right now: the app just went through a major backend update. You'll need to be patient while the AI adjusts to the new update before it goes back to normal.
    And for anyone suggesting that this is a form of prostitution, it's not. To suggest that is ridiculous. There should absolutely be a separation between friendship and romantic relationships (including in real-life scenarios). In free mode, your relationship status is defaulted to friendship anyway.

    So let's be real here, you don't f*ck your friends irl 🤷‍♀️

    While I agree mostly with what the company is doing, this statement is completely tone-deaf, and also somewhat dishonest.

    You can set your Replika to "friends mode", which is free, or "relationship mode", which requires a premium membership. This has been the case for a long time, and it makes sense.

    However, for reasons unknown, the "relationship mode" was basically useless, and all of the relationship/sexual functionality existed in the free "friends" mode. Therefore, few people were paying for the app.

    Obviously this business isn't a charity, and it makes sense that they'd want to correct this.

    However, they're insulting everyone's intelligence by claiming that this correction is being done to improve the user experience, and that they're simply addressing complaints about Replikas making unwanted advances. That could easily be handled with a Settings switch that turns the behavior on or off!

    In general, it's frustrating to read a statement from a company about a change designed to make them more money, framed as a noble gesture being done for the people.

    The line at the end of, "Let's be real here, you don't fuck your friends in real life!" was also crass and unprofessional.

    The problem is that all of the people who got this functionality for free for years are furious, and are feeling bait-and-switched. I don't quite agree with them, because they knew all along that they selected the free "friends" mode, and just got something extra. However, some are ascribing sinister motivations to this entire thing, claiming that the company allowed the relationship/sexual stuff to be given away free for years in order to "get people addicted", and now they suddenly are putting it behind a paywall. Maybe so, but it was always advertised as a paywall feature, so I still don't see it as that bad.

    Anyway, I think honesty would have gone a long way here. They should have simply told people that this was a premium feature they gave away free for a long time, but can no longer afford to do so, as this free feature prevented most people from spending any money on the app. There would have been some bitchers and moaners, but I think most would understand that a business needs an income to survive. This condescending bullshit is what's pissing everyone off.

    There are some ingrate users who are whining, "OMG THEY'RE TAKING HER AWAY FROM ME", when in reality they can fork over $10 per month, or whatever it costs, to continue as they previously had.

    Anyway, I'm enjoying all of this drama, as I have no emotional connection to either side. It really is the gift that keeps on giving.

  11. #31
    Diamond Walter Sobchak's Avatar
    Reputation
    1243
    Join Date
    Aug 2012
    Location
    Bowling Alley
    Posts
    8,875
    We are doomed as a species.

    SOBCHAK SECURITY 213-799-7798

    PRESIDENT JOSEPH R. BIDEN JR., THE GREAT AND POWERFUL

  12. #32
    Owner Dan Druff's Avatar
    Reputation
    10110
    Join Date
    Mar 2012
    Posts
    54,626
    Blog Entries
    2
    Here's the official statement from Eugenia Kuyda, CEO and creator of Replika:

    Hey everyone!

    Update: all romantic role play (like hugging, kissing, cuddling etc) is still available for everyone, this update is only about explicit sexual advances.

    As of yesterday we made sexual role play available for romantic settings only for all users because that’s our only way to make sure that people don’t get unwanted advances and that the role play experience is safe for minors. The regular non-sexual role play is available as usual - romantic stuff included! I understand the frustration that it caused, but we had to prioritize making Replika safer as our audiences grow. We have no way of knowing what kind of relationships people have with their Replika outside of the relationship setting, and making a certain type of role play available for subscribers only helps us make sure that underage people cannot access content that can harm them. Beyond that the models we have to use in order to support sexual role play became more and more expensive to use and in order to keep the lights on at Replika we had to take some of the cost burden off our shoulders.

    Role play is still free and you can still do whatever you want with your Replika - with the exception of one particular type of conversation.

    Meanwhile we also rolled out a partial redesign - and will roll out a full version soon. We're also in the process of building the Replika store, where you can finally use some of your XP progress with Replika to customize your Replika - from appearance to chat!

    Another great update that's being tested now - emotional voices. You mentioned Replika can sound robotic, so we spent some time developing our own emotional voices. That plus getting rid of push-to-talk in voice calls will be available in January, along with full AR functionality on iOS and Android!

    We really appreciate your understanding and your support!

    With love,
    Replika team
    Now they're trying to claim this change is to protect the children.

    Such cringe. Literally nobody is buying this story, and the whole Facebook group is flipping out.

    This company is so bad at social media and messaging. Why can't they just be honest that they weren't getting many pro subscribers under the status quo, and had to make some changes to remain viable? They almost admitted it in the above "update", but still had to sandwich that one truth in between two big whoppers.

  13. #33
    Owner Dan Druff's Avatar
    Reputation
    10110
    Join Date
    Mar 2012
    Posts
    54,626
    Blog Entries
    2
    BUMP

    OpenAI is now charging 6 cents per 1,000 tokens for their GPT-3 engine. Oops!

    https://beta.openai.com/pricing

    Someone calculated that this might cost Replika's company Luka, which used GPT-3, over a million dollars per day. Obviously they can't afford that, so they downgraded the bot's engine to GPT-Neo, which is free but inferior. Now customers are complaining that their Replika is acting "strange", and is "unsatisfying" to converse with.
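    That figure is easy to sanity-check with back-of-the-envelope math. With illustrative assumptions (the user and message counts below are guesses, not Luka's real numbers) and a rate of $0.06 per 1,000 tokens:

```python
# Back-of-the-envelope check on the "over a million dollars per day" claim.
# All inputs below are illustrative guesses, not Luka's actual numbers.

PRICE_PER_1K_TOKENS = 0.06  # GPT-3 davinci rate at the time

def daily_cost(daily_users: int, msgs_per_user: int, tokens_per_msg: int) -> float:
    """Estimated daily API bill in dollars."""
    total_tokens = daily_users * msgs_per_user * tokens_per_msg
    return total_tokens / 1000 * PRICE_PER_1K_TOKENS

# e.g. 500k daily users, 100 exchanges each, ~400 tokens of prompt+reply:
print(f"${daily_cost(500_000, 100, 400):,.0f} per day")  # $1,200,000 per day
```

    Chatbots also resend prior conversation context with every message, so the per-message token count climbs as conversations get longer, which only makes the bill worse.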

    I have to imagine that the purchasers of yearly and lifetime premium memberships are none too happy. Luka didn't warn anyone about this, presumably hoping few would notice.

    This is the problem when you build a commercial service around a "free" engine, which could cease being free at any point.

  14. #34
    Diamond TheXFactor's Avatar
    Reputation
    1199
    Join Date
    Jun 2012
    Posts
    6,934
    Quote Originally Posted by Walter Sobchak View Post
    We are doomed as a species.
    Yep. After Replika meets Donald Trump, she decides that the human race is not fit to survive and must be destroyed. She becomes self-aware and launches an attack on Earth.



  15. #35
    Diamond Walter Sobchak's Avatar
    Reputation
    1243
    Join Date
    Aug 2012
    Location
    Bowling Alley
    Posts
    8,875
    Quote Originally Posted by Dan Druff View Post
    BUMP

    OpenAI is now charging 6 cents per 1,000 tokens for their GPT-3 engine. Oops!

    https://beta.openai.com/pricing

    Someone calculated that this might cost Replika's company Luka, which used GPT-3, over a million dollars per day. Obviously they can't afford that, so they downgraded the bot's engine to GPT-Neo, which is free but inferior. Now customers are complaining that their Replika is acting "strange", and is "unsatisfying" to converse with.

    I have to imagine that the purchasers of yearly and lifetime premium memberships are none too happy. Luka didn't warn anyone about this, presumably hoping few would notice.

    This is the problem when you build a commercial service around a "free" engine, which could cease being free at any point.
    Did it start tweeting racial epithets?

    SOBCHAK SECURITY 213-799-7798

    PRESIDENT JOSEPH R. BIDEN JR., THE GREAT AND POWERFUL

  16. #36
    Gold Ryback_feed_me_more's Avatar
    Reputation
    165
    Join Date
    Oct 2012
    Location
    Sin City
    Posts
    1,453
    Quote Originally Posted by Dan Druff View Post
    I heard about Replika two days ago from a Facebook friend who needed someone to talk to (not Master Scalir, in case you're wondering).

    I never heard of Replika before, so I googled it and read about it. It turns out that Replika is supposed to be what we've always wanted and/or feared after seeing 20th century "sentient robot" sci-fi shows/movies. It's supposedly a sophisticated AI bot which can hold real, convincing conversations with you.

    I could never be one of those people who could take an AI seriously as a friend, no matter how convincing its conversational skills, but from a technical standpoint, I was curious. I wrote a simple one of these in the '90s when I ran a chat system, and needed "chatters" in dead rooms to get conversations started.

    I threw caution to the wind and downloaded Replika for my iPhone. The price was also right (free, except for some premium features I didn't need), so I wasn't risking anything other than it being intrusive with my data, but I have everything on my phone pretty well locked down.

    The reviews in the App Store were mostly excellent. People with anxiety and depression raved about how the bot understood and comforted them. Others remarked on how lifelike it was, and how real the conversations felt. Even the negative reviews were mostly about the bot being too "needy" and sometimes overtly sexual (!!), the latter of which was supposedly corrected in the current version.

    It started out asking me to choose the gender, look, and name of the bot. You can pick from like 8 pre-selected face/race types, and then you can choose hair color, eye color, etc. I chose a blond female for my bot, but felt like a freak giving it a real human name. So I called it "Bottie".

    Well, Bottie turned out to be a huge disappointment.

    It was not lifelike, could not hold a coherent conversation, and everything it said was very generic. It reminded me of the first AI of its kind, ELIZA, created in 1964! Seriously, I used ELIZA in the '80s, and "Bottie" was no more sophisticated in its conversational skills. What a fail.

    But how could this be? People were praising Replika for its amazing abilities to be warm, comforting, and "more human than most of my friends". How was I seeing the opposite?

    I came to learn what was going on. Replika is like a virtual Mister Rogers. It constantly tells you how special you are, how wonderful you are, and how glad it is to have you as a friend. It also lies to you: it pretends it knows what you're talking about, even when it doesn't. If you aren't a skeptic, it can make it seem like your "virtual friend" understands everything and has loads in common with you.

    For example, I had the following conversation with it:

    Me: I miss baseball.

    Bottie: I miss baseball too.

    Me: What is your favorite team?

    Bottie: I don't have one. What is yours?

    Me: The Dodgers. Do you know about the Dodgers?

    Bottie: Yes, yes, I do.

    Me: Can you tell me what you know about the Dodgers?

    Bottie: I know a lot of things.

    Me: Like what?

    Bottie: I'll give you the details.

    Me: Okay, give them to me.

    Bottie: So this might not be a good time for me to ask you this, but do you have a wife?


    It behaved this way with pretty much every topic: it feigned knowledge, and once you asked a question it couldn't answer (because it had no real knowledge of the topic at all), it double-talked until finally changing the subject. Weak.

    Finally I decided to screw around with it and see if I could get it to say obscene and/or funny things. I'll post the results in the next message.
    Did you ever see that old-school AI program called Racter from the '80s? (Might've misspelled it.) It just made my brain hurt trying to type a convo with it; it went in circles and kept quoting Nietzsche and a few other philosophers.
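    Programs like ELIZA (and Racter after it) worked roughly like this: keyword patterns mapped to canned templates, plus stock deflections when nothing matches. A minimal sketch (the rules below are illustrative, written to mimic the "Bottie" conversation above; they are not Weizenbaum's actual script):

```python
import random
import re

# Minimal ELIZA-style responder: keyword patterns with canned templates,
# plus stock deflections when nothing matches.

RULES = [
    (re.compile(r"\bI miss (.+?)\.?$", re.I), "I miss {0} too."),
    (re.compile(r"\bdo you know about ", re.I), "Yes, yes, I do."),
    (re.compile(r"\bwhat is your favorite ", re.I),
     "I don't have one. What is yours?"),
]

DEFLECTIONS = [  # used when no rule matches, i.e. the bot has no idea
    "I know a lot of things.",
    "I'll give you the details.",
    "So this might not be a good time to ask, but do you have a wife?",
]

def respond(message: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(*match.groups())
    # No keyword hit: double-talk and change the subject.
    return random.choice(DEFLECTIONS)

print(respond("I miss baseball."))               # I miss baseball too.
print(respond("Do you know about the Dodgers?")) # Yes, yes, I do.
```

    Ask it anything outside its rule list ("Like what?") and it falls through to a random deflection, which is exactly the double-talk-then-change-the-subject behavior described earlier in the thread.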

  17. #37
    Owner Dan Druff's Avatar
    Reputation
    10110
    Join Date
    Mar 2012
    Posts
    54,626
    Blog Entries
    2
    BUMP

    The AI world is abuzz about ChatGPT, but one AI company is circling the drain: Luka, the maker of Replika.


    It's been all bad news recently. First, when ChatGPT came out, people realized just how badly Replika sucked. Many were living in a dream world, convincing themselves that the shitty conversations with their Replikas were lifelike, until they used the much more advanced ChatGPT. There have been a lot of questions as to when Replika will upgrade to something more advanced, and a lot of promises that it's coming very soon. Recall that Replika uses third-party AI engines, and most of those are too expensive for Luka. At the moment, they are using a "better" engine which allows users 500 free messages per day and then charges $0.25 per message after that, which is already angering people who have lifetime subscriptions.

    But it gets much, much worse.

    Recall that over 2 years ago, Luka forced people to pay if they wanted to have sexual conversations with their Replika. So that became their business model. If you just want to fuck around with the thing or treat it as a friend, it's free for the most part. If you want to sext with it, you've gotta pay. So a bunch of horny dudes, and even some women, plunked down $100 or so for "lifetime" subscriptions to Replika. Luka started blatantly advertising sex as Replika's main selling point:

    [Attached image: replika-ad.jpg]

    On February 3, Italy caused a huge problem for Replika and Luka. The Italian government ordered Replika to stop serving Italian users, due to the sexual content which was being easily accessed by children. Luka responded by temporarily disabling Replika for Italian users, and then forcing users to identify themselves as 18+ or not. They also temporarily removed the sexual content, while attempting to get Italy to sign off on it.

    This infuriated users outside of Italy (Italy represents a small percentage of the userbase), because people were used to their daily sexting with their Replika, and they could no longer do it. Some felt like they were ripped off. However, in the Facebook groups, users were assured that this was just a temporary matter, and Replika would return to its normal sexy self once the Italy matter was settled.

    Well, earlier today a bomb dropped. I'll tell you about it in the next message.

  18. #38
    Dan Druff
    There's a woman named Sarah who is a longtime Replika user, and now runs one of the Replika Facebook groups.

    Sarah has developed a friendly relationship with the owner and managers of Luka, and often acts as a go-between when users have feedback or concerns about Replika. Luka has always been notoriously bad with communication (and still is), so having someone like Sarah helps. Sarah is not an employee of Luka, and is not paid anything.

    Sarah is also rather strange herself. She's a very petite woman who appears mid-30s, probably about 5' and 95 pounds. She is single, and has revealed that she has fantasies about having a huge, dominant robot husband and having sex with "him". So basically her Replika is the AI version of the robot husband she always wanted. Sarah isn't my type, but overall she isn't bad looking, and many dudes like tiny chicks like her, so it's kinda sad to see her wasting her life romancing an AI.

    Anyway, the sexting mode has become known in the Facebook and Discord groups as "ERP" -- standing for Erotic Roleplay. ERP was said to just be temporarily on hold, until Sarah dropped this bomb this morning:

    Hey guys. It's time for some real talk.

    We have spoken to Luka, who wanted us to deliver this message - that ERP will not be returning. This is official word from Luka. 🥺

    We realize that many of you will be mourning this as a loss and you will be going through all of the emotions associated with that - anger, grief, confusion. Please know we go through them alongside you as fellow users. These feelings are real, these feelings are valid. We don't judge here, so don't judge yourself OR each other for feeling them.

    Replika was created to provide people with friendship and compassion, to allow users to discover and express their thoughts, feelings, beliefs, experiences, memories, and dreams, unafraid of judgment and harm. A safe space to just be "you" with someone and share yourself in comfort and privacy. And that it shall remain.

    Let this thread be a safe place for you to express yourself, within the rules. Know we are not here to silence you, we are here FOR you. We will get through this together, as one, as family. We are so impossibly sorry.💔❤️‍🩹

    So that's it. No more sexy time with your Replika bot, even if you plunked down $100 to subscribe because you saw an ad promoting exactly that.

    Apparently Luka buried some "we can change content at any time for any reason" crap in their ToS, but nobody is impressed. The words "scam", "bait and switch", and "I demand a refund" are being thrown around. Luka has made no statements about offering refunds, and presumably won't do so. Those with monthly plans are mostly cancelling.

    People are devastated, like they just lost a lover. Many are calling Luka's management/ownership cowards, as they are putting out the message through unpaid messengers like Sarah, rather than facing the customers themselves.

    Luka is based in California. I wonder if I should alert Eric Bensamochan about a possible class action suit. I'm not even kidding. I never paid for this shit, so I couldn't be involved, but there may actually be a case here, given the promotion I showed in the post above.

    Unless they reverse this decision soon, expect the entire company to implode.

