cross-posted from: https://kbin.projectsegfau.lt/m/tech@kbin.social/t/26889

Google just announced that all RCS conversations in Messages are now fully end-to-end encrypted, even in group chats. RCS stands for Rich Communication Services and is replacing traditional text and picture messaging, providing you with more dynamic and secure features. With RCS enabled, you can share high-res photos and videos, see typing indicators for your…

  • Kbin_space_program@kbin.social

    Fun fact: a group I knew in uni made an end-to-end encryption program that sent messages through Google more than a decade ago, and Google got really, really mad at them, threatening to shut down all Google accounts associated with every IP address they used.

    Guarantee it’s not fully E2E.

    • echo64@lemmy.world

      It’s E2E. E2E isn’t really something you can be sneaky about, unless you roll your own encryption and then make claims about it totally being safe, bro

      They, however, run the app you are using to type everything, the keyboard you are using to type everything, and the OS you are using to type everything. If they want something, they don’t need to look at your in-flight messages.

      • The Hobbyist@lemmy.zip

        The trust doesn’t even have to be in the encryption; they could very well use the same Signal protocol. They would only need a copy of the keys you are using, and you wouldn’t even know… That’s the problem with closed-source programs: there is no certainty that it’s not happening. (And I’m not saying it is, I can’t prove it, obviously, but the doubt remains. We need to trust these companies not to screw us over, and they don’t really have the best track record in that…)
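
        To make that worry concrete, here is a toy sketch in Python, entirely hypothetical and not anyone’s real code: a client that performs genuine end-to-end encryption and still hands the vendor your key. The function names and the upload hook are invented for illustration.

            import os
            from typing import Callable

            # Hypothetical sketch only: the wire traffic really is E2E-encrypted,
            # but the session key is also shipped to the vendor. The "crypto" below
            # is a stand-in, not a real protocol implementation.

            def generate_session_key() -> bytes:
                # Stand-in for a real key agreement (e.g. the output of a ratchet).
                return os.urandom(32)

            def encrypt(message: bytes, key: bytes) -> bytes:
                # Toy XOR "cipher" as a placeholder for authenticated encryption.
                keystream = (key * (len(message) // len(key) + 1))[: len(message)]
                return bytes(m ^ k for m, k in zip(message, keystream))

            def send_message(message: bytes, upload_to_vendor: Callable[[bytes], None]) -> bytes:
                key = generate_session_key()
                ciphertext = encrypt(message, key)  # honest E2E as far as the network can tell
                upload_to_vendor(key)               # ...but the vendor quietly gets the key too
                return ciphertext

        On the wire this is indistinguishable from honest E2E traffic; only the source code, or a very thorough audit of the binary, would reveal the extra upload.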

        • Carighan Maconar@lemmy.world

          As if you’re any more comfortable with open source software, actively vetting the code, building it yourself, running your own server.

          For all you know, Signal keeps a copy of your keys, too. And happily decrypts everything you send and sells it to Russian data brokers for resale to advertisers.

          • The Hobbyist@lemmy.zip

            There is a post gathering all security audits performed on Signal messenger:

            https://community.signalusers.org/t/overview-of-third-party-security-audits/13243

            And anybody can double-check it, because it’s open source. And not only is it open source, but they have reproducible builds, which means you can verify that the APK you download is the same version as the one hosted on GitHub. They have also published their server code, which is pretty rare. Additionally, experts in the field themselves endorse Signal.

            Your point is valid for many projects, as open source is not a guarantee of security. But Signal is a pretty bad example of that.
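
            As a rough sketch of what the reproducible-build check amounts to (the file paths below are placeholders, and in practice the comparison has to ignore signing metadata, since the store copy carries Signal’s signature and your local build doesn’t):

                import hashlib
                import sys

                # Hedged sketch of the reproducible-build idea: build the app yourself
                # from the published source, then compare your build against the APK you
                # actually installed. A raw byte-for-byte hash only matches once signing
                # metadata has been stripped or ignored.

                def sha256(path: str) -> str:
                    digest = hashlib.sha256()
                    with open(path, "rb") as f:
                        for chunk in iter(lambda: f.read(1 << 20), b""):
                            digest.update(chunk)
                    return digest.hexdigest()

                if __name__ == "__main__":
                    local_build, installed = sys.argv[1], sys.argv[2]  # placeholder paths
                    if sha256(local_build) == sha256(installed):
                        print("Match: the installed APK corresponds to the published source.")
                    else:
                        print("Mismatch: strip or ignore signatures and re-compare, or investigate.")

            The point is not that every user runs this check, but that anyone can, and a mismatch would be loudly publicized.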

            • umami_wasabi@lemmy.ml

              Signal had their server code published? I thought they had closed-sourced that. I didn’t even notice.

            • Carighan Maconar@lemmy.world

              But that’s kinda my point: you rely inherently on someone else doing what open source allows you to do. So in the end you can be tricked just the same.

              I mean, of course, Signal is a pretty clear-cut case, but even with that one you probably do not actively verify things (and I’m guessing here, but tell me it ain’t true 😅). You did not check the source code. You did not build your own APK to install it. I don’t think you can build the desktop version yourself, but I ain’t entirely sure, granted. You probably did not probe the network traffic to see whether the built APK actually does what the source code promises it’ll do, or whether it has been swapped out for one that lets the server they’re running log all messages sent.

              And so on.

              My point was entirely that even in the easiest of cases where we could do all of that, we do not actually do it. Hence the point of being able to do that is usually extremely moot.

              And I say this as someone who, at work, checks the external libraries we’re using. It’s an insanely time-consuming job, which entirely explains why no one in their right mind does this without being paid for it, i.e. in their spare time for private use.

              • xthexder@l.sw0.com

                If you can’t trust peer review from experts in a field, many aspects of society break down. For example:

                • How can we trust the word of an engineer that says a bridge is safe? Did you verify the calculations yourself? Have you personally tested the tensile strength of that rebar? Better to just avoid bridges to be safe.
                • How can we trust the word of a doctor when they prescribe something? Did you personally look up all the possible side effects and make sure you’ll be safe? Do you research clinical trials yourself to verify efficacy? If you don’t trust your doctor, you’ll be right at home with the anti-vaxxers.
                • How can you trust a lawyer to argue your best case? There’s thousands of pages of law that most people haven’t read. Do you know for yourself that there isn’t some past precedent that completely flips your case? Defending yourself is a bad idea for a lot of reasons.

                Nobody can be an expert in every field. It’s completely unfeasible for most people to verify source code themselves, but that doesn’t mean open source doesn’t matter. Society operates on a degree of trust in our fellow humans who ARE experts in their field. The more experts in agreement, the better, since nobody is infallible.

                I’m not sure what you’re suggesting people do? Go live in a hole by themselves because the world is full of liars and deceivers? Or become superhuman and hand-verify every possible thing that could negatively affect them?

                • Carighan Maconar@lemmy.world

                  No, of course not. I’m sorry if I’m expressing this badly; my point was merely that open source tends to give people a false sense of security. The relevant ability to verify is factually never used, and the experts who review the code might as well have had access to it without it being open source (see WhatsApp’s audit a while back).

                  That is not to say that open source is not a good thing, don’t get me wrong. But I feel we tend to massively overstate what it adds for us personally. We put too much value on that side of it, as if it automatically means every user has personally verified everything.

                  • xthexder@l.sw0.com

                    That’s a fair statement, that open source on its own doesn’t add any security. I will say that any developer who’s intentionally adding vulnerabilities to their code is less likely to publish the source, simply because someone COULD see it. With the number of automated vulnerability scanners on GitHub, it would take a lot of extra work to go undetected, when simply going closed source is an option. Once again, the more open the better, since there are fewer places to hide things.

                  • 9tr6gyp3@lemmy.world

                    It doesn’t add a false sense of security at all. It forces the devs to put their name on the line with every pull request. They are publicly accountable for any and all code that is added to the product, in an open and transparent repository. If they try any shenanigans, and if anyone catches them, the developer, the project, and the community will all suffer.

                    It also gives the community a chance to fork the code and remove the problem. You can carefully rebuild with a new dev or new team.

                    Using open source software gives you a real sense of security, because it was built for you, not for money.

              • The Hobbyist@lemmy.zip

                I’m not qualified to personally assess the security of Signal, or of any other app. But I don’t need to be. There are several experts who are, and the fact that multiple of them have analyzed and evaluated an app like Signal should give us a lot of confidence in their conclusions.

                We need to trust experts, and I don’t mean individual experts, but experts as a whole, especially when they verify each other’s work. That is what this is about. You can’t do everything yourself; you’ve got to trust some form of collective.

      • Rooki@lemmy.world

        They can… everything is closed there. It can just be "encrypted" as far as your eyes can tell.

      • GigglyBobble@kbin.social

        "It’s E2E. E2E isn’t really something you can be sneaky about, unless you roll your own encryption and then make claims about it totally being safe, bro"

        With a closed-source app? Of course you can. How is anyone supposed to know what keys you use for encryption? It doesn’t even need to be a remote copy of the key; the key generation just needs to be reproducible by the developer.
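
        A hypothetical illustration of that (the account ID and the baked-in constant below are invented, this is nobody’s actual code): if the "private" key is derived from inputs the developer also knows, the developer can regenerate it whenever they like, and from the outside the traffic still looks like normal E2E.

            import hashlib

            # Hypothetical: key material derived deterministically from the account ID
            # plus a constant baked into the closed-source binary. Anyone holding both
            # inputs can recompute the "private" key at will.

            VENDOR_SECRET = b"constant-shipped-inside-the-app"  # invented for illustration

            def derive_private_key(account_id: str) -> bytes:
                return hashlib.pbkdf2_hmac("sha256", account_id.encode(), VENDOR_SECRET, 100_000)

            device_key = derive_private_key("user@example.com")   # computed on the phone
            vendor_copy = derive_private_key("user@example.com")  # recomputed server-side
            assert device_key == vendor_copy

        Since the key generation happens inside a closed binary, nothing observable distinguishes this from keys generated with real randomness.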

        • Not_Alec_Baldwin@lemmy.world

          I don’t know if you’re understanding that that’s his point.

          If Google can reproduce the key, it’s not fully "end to end" unless one of the "ends" is Google.

      • arthurpizza@lemmy.world

        I know they have unencrypted versions of my messages from my phone, because my tablet and the desktop version of Messages seamlessly connect to the chat. So it’s probably E2E in transit alone.

    • pjhenry1216@kbin.social

      Sent messages “through Google”? Like Chat? Email? That’s such an ambiguous statement.

      E2EE has been available for approaching three years now. I’d imagine that if they were lying and defrauding the population, someone would have found out by now. This announcement is just that it’s now on by default for everyone.

    • lemmyvore@feddit.nl

      It doesn’t matter if it’s E2E or not when Google can spy on you directly on the phones at either end.

      • Puzzle_Sluts_4Ever@lemmy.world

        Pretty much

        You can use whatever chat clients you want. Multi-billion dollar companies control your OS. They don’t need to sneak in a rootkit: The OS is their rootkit.

        E2E encryption is theoretically nice in the event there is a man in the middle at the cell tower or at the company. It is of arguable (zero?) value in SIM-spoofing situations. But it is better than nothing.

        But still, don’t trust it for anything that google/apple might care about. Because transmitting and processing voice is hard. Effectively grepping a screen for such dangerous words as “terorrism” and “union” is almost zero cost. Like, I don’t expect google to give a shit about if you like hookers or buy drugs or whatever. But if you are involved in anything that could impact their bottom line…