negative zero

[Video] Oxen Ecosystem: 3. Session

2022 January 31

[crypto] [decentralization] [info] [messaging] [opinion] [oxen] [privacy] [session] [tech] [video]


Oxen Ecosystem

  0. Introduction
  1. Oxen
  2. Lokinet
  3. Session
  4. Thoughts on the Oxen Ecosystem


This is the fourth video in a 5-video series on Oxen, Lokinet, and Session. This video is about Session, an encrypted messaging app that runs on the network of Oxen Service Nodes.

All original content in this video is dedicated to the public domain. Third-party resources included in this video are not necessarily public domain, but they fall into one of two categories:

  1. They are public domain or licensed under a license which permits commercial use without attribution.
  2. They are copyrighted works which I have not licensed but which I believe I am using in accordance with fair use.



Sources (in order)

  1. P. Zimmermann. Why I Wrote PGP. https://www.philzimmermann.com/EN/essays/WhyIWrotePGP.html, 1999.
  2. N. Borisov, I. Goldberg, E. Brewer. Off-the-Record Communication, or, Why Not To Use PGP. https://otr.cypherpunks.ca/otr-wpes.pdf, 2004.
  3. T. Perrin, M. Marlinspike. The Double Ratchet Algorithm. https://signal.org/docs/specifications/doubleratchet/, November 2016.
  4. M. Marlinspike, T. Perrin. The X3DH Key Agreement Protocol. https://signal.org/docs/specifications/x3dh/, November 2016.
  5. P. H. O'Neill. Watch the video Edward Snowden made to teach Glenn Greenwald email encryption. https://www.dailydot.com/debug/edward-snowden-gpg-for-journalists-video-nsa-glenn-greenwald/, May 2014.
  6. M. Marlinspike. Signal Foundation. https://signal.org/signal-foundation/, February 2018.
  7. M. Marlinspike. Comment. In Please add LibreSignal to f-droid. https://github.com/LibreSignal/LibreSignal/issues/37#issuecomment-217339450, May 2016.
  8. B. Lovejoy. Alarming test shows US carriers fail to protect you against SIM-swap attacks. https://9to5mac.com/2020/01/13/sim-swap-attacks/, January 2020.
  9. M. Marlinspike. Reflections: The ecosystem is moving. https://www.signal.org/the-ecosystem-is-moving/, May 2016.
  10. larma. About backdoors in Signal and other apps. https://blogs.fsfe.org/larma/2017/signal-backdoors/, January 2017.
  11. [Wiki] Signal Android app on F-Droid store / F-Droid status. https://community.signalusers.org/t/wiki-signal-android-app-on-f-droid-store-f-droid-status/28581
  12. Use something else instead of Google reCAPTCHA. https://community.signalusers.org/t/use-something-else-instead-of-google-recaptcha/6289
  13. scottnonnenberg-signal. Comment. In Standalone device setup error. https://github.com/signalapp/Signal-Desktop/issues/5417#issuecomment-885154474
  14. A. Macleod. OTF – The "Independent" Internet Freedom Organization That Makes All Your Favorite Privacy Apps – is Staffed Full of Spies. https://www.mintpressnews.com/the-open-technology-fund-makes-privacy-apps-staffed-spies/279147/, December 2021.
  15. K. Jefferys, M. Shishmarev, S. Harman. Session: A Model for End-To-End Encrypted Conversations With Minimal Metadata Leakage. https://arxiv.org/pdf/2002.04609.pdf, March 2020.
  16. The Oxen Project. Session Protocol: Technical implementation details. https://getsession.org/session-protocol-technical-information, December 2020.
  17. The Oxen Project. Oxen Name Service for Session. https://docs.oxen.io/products-built-on-oxen/session/loki-name-system-for-session
  18. The Oxen Project. Session code audit: Technical report published by Quarkslab. https://getsession.org/session-code-audit, April 2021.
  19. Quarkslab. Oxen Session Audit. https://blog.quarkslab.com/resources/2021-05-04_audit-of-session-secure-messaging-application/20-08-Oxen-REP-v1.4.pdf, May 2021.

Script

This video assumes you're already familiar with Oxen, specifically the network of Oxen Service Nodes and the Oxen Name System, and with Lokinet, or at least onion routing. These concepts will be referenced without explanation in this video.


3.1. Background

Before we can talk about Session, we first need to explore previous methods for encrypting private conversations.


3.1.1. PGP

In 1991, Phil Zimmermann, motivated by a proposed bill which would require makers of encrypted devices to backdoor the encryption[1], published a program called Pretty Good Privacy, or PGP for short. PGP enabled users to encrypt their emails, protecting their communications from surveillance.

With PGP, each user has two keys: a private key and a public key. I'm abstracting some technical details away here, but basically, if Alice wants to send a message to Bob, she first signs the message with her own private key, then encrypts the message with Bob's public key. This enables Bob to decrypt the message with his own private key, and verify Alice's signature with her public key.
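To make this concrete, here's a minimal sketch of the sign-then-encrypt pattern in Python, using the PyNaCl library. This is an illustration of the concept only: real PGP uses different primitives (RSA or ElGamal keys and the OpenPGP message format), and everything here, including the key names, is a simplified stand-in.

    # Sketch of PGP-style sign-then-encrypt with modern primitives (PyNaCl).
    from nacl.public import PrivateKey, SealedBox
    from nacl.signing import SigningKey

    # Each user holds long-term keys: Alice's for signing, Bob's for encryption.
    alice_signing_key = SigningKey.generate()
    alice_verify_key = alice_signing_key.verify_key
    bob_private_key = PrivateKey.generate()
    bob_public_key = bob_private_key.public_key

    # Alice signs with her private key, then encrypts to Bob's public key.
    signed = alice_signing_key.sign(b"Meet at noon.")
    ciphertext = SealedBox(bob_public_key).encrypt(bytes(signed))

    # Bob decrypts with his private key and verifies Alice's signature.
    signed_plaintext = SealedBox(bob_private_key).decrypt(ciphertext)
    message = alice_verify_key.verify(signed_plaintext)  # raises if forged
    print(message)  # b'Meet at noon.'

Notice that Bob ends up holding a signed message that anyone with Alice's public key can verify. That detail matters in the next section.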


3.1.2. OTR

In 2004, Nikita Borisov, Ian Goldberg, and Eric Brewer published a paper titled Off-the-Record Communication, or, Why Not To Use PGP[2]. This paper outlines two issues with PGP's approach to encryption and authentication:

  1. Since everyone uses Bob's public key to encrypt messages for Bob, someone who gains access to Bob's private key can go back and decrypt all the encrypted messages that were ever sent to him.
  2. When Alice signs her message, anyone with a copy of the plaintext message can look at the signature and verify that it came from Alice. This property is called non-repudiation: Alice cannot deny to a third party (like a judge) that she wrote a message, because Bob can use her signature to prove that she did.

This paper proposes a system that fixes these issues, called OTR, along with a plugin for the GAIM instant messaging client (now known as Pidgin). With OTR, we have improved properties:

  1. The encryption key changes with roughly every message. This means that if someone gets ahold of the encryption key for one message, it cannot also be used to decrypt previous messages. This property is called forward secrecy. (See the sketch after this list.)
  2. Rather than using a non-repudiable signature, messages are authenticated in a separate way such that when Alice sends Bob a message, Bob can be sure it was Alice who wrote it, but he can't prove it to someone else. This is called deniable authentication: Bob can authenticate Alice, but Alice can deny it to a third party.
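As a rough illustration of forward secrecy, here's a minimal hash-ratchet sketch in Python. To be clear, this is a toy of my own, not OTR's actual key schedule (OTR rotates Diffie-Hellman keys); it only shows why compromising today's key can't expose yesterday's messages.

    # Toy symmetric ratchet: each message gets a fresh key, and old keys
    # cannot be recomputed because SHA-256 is one-way.
    import hashlib

    def ratchet(chain_key: bytes) -> tuple[bytes, bytes]:
        """Derive the next chain key and a one-time message key."""
        next_chain_key = hashlib.sha256(chain_key + b"\x01").digest()
        message_key = hashlib.sha256(chain_key + b"\x02").digest()
        return next_chain_key, message_key

    # Hypothetical starting point: a secret agreed via a DH handshake.
    chain_key = hashlib.sha256(b"initial shared secret").digest()
    for i in range(3):
        chain_key, message_key = ratchet(chain_key)
        # Encrypt message i with message_key, then delete message_key.
        # Someone who later steals chain_key still can't go backwards.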

To help conceptualize deniable authentication, consider writing a letter to a friend. If you sign the letter, your friend will be able to show the letter to someone else, who in turn will be able to verify that you wrote the letter. If, however, you put the unsigned letter in an envelope, then sign the envelope, your friend who opens the signed envelope with an unbroken seal will be sure that you wrote the letter inside, but they won't be able to then prove it to someone else. The contents of an envelope with a broken seal could have been put there by anyone!

This is not how deniable authentication actually works; it's just a simple real-world example to help you understand the concept of messages whose author can be verified by the receiver but not proven to a third party.
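In real protocols, deniable authentication is commonly built from a message authentication code (MAC) keyed with a shared secret. Here's a minimal Python sketch of the idea; again, this illustrates the concept rather than OTR's exact construction.

    # Both parties hold the same MAC key, so a valid tag only proves
    # "someone who knows the key wrote this". Bob knows it was Alice
    # (he didn't write it himself), but he can't convince anyone else.
    import hashlib
    import hmac

    shared_key = hashlib.sha256(b"secret from an authenticated DH exchange").digest()

    def tag(message: bytes) -> bytes:
        return hmac.new(shared_key, message, hashlib.sha256).digest()

    msg = b"Meet at noon."
    t = tag(msg)

    # Bob verifies the tag...
    assert hmac.compare_digest(t, tag(msg))
    # ...but since he could have computed the identical tag himself,
    # the transcript is worthless as third-party proof of authorship.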


3.1.3. Signal

You might have heard of Signal, an encrypted messaging app that's actually popular enough that normal people use it, not just privacy nerds. The Signal Protocol, the cryptography used in the app, is based on OTR. It improves on the forward secrecy of OTR with the Double Ratchet algorithm[3], ensuring that every single message is encrypted with a different key. It also solves some usability issues with OTR. Notably, OTR required both parties to be online to establish an encrypted connection. With Signal, Alice can start an encrypted session with Bob, even if Bob is offline.[4]
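To give a flavor of how the Double Ratchet gets a fresh key for every message, here's a heavily simplified sketch of one Diffie-Hellman ratchet step in Python using the cryptography library. The structure and names are my own compression of the idea; see the Signal specifications[3][4] for the real thing.

    # One DH ratchet step: a fresh ephemeral DH result is mixed into the
    # root key, producing a new root key and a chain key for sending.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def kdf_rk(root_key: bytes, dh_output: bytes) -> tuple[bytes, bytes]:
        out = HKDF(algorithm=hashes.SHA256(), length=64, salt=root_key,
                   info=b"dh-ratchet").derive(dh_output)
        return out[:32], out[32:]

    root_key = b"\x00" * 32          # in real Signal, this comes from X3DH
    alice_ratchet = X25519PrivateKey.generate()  # fresh keypair per step
    bob_ratchet = X25519PrivateKey.generate()

    dh_out = alice_ratchet.exchange(bob_ratchet.public_key())
    root_key, sending_chain = kdf_rk(root_key, dh_out)
    # The chain key is then hash-ratcheted per message (as sketched earlier),
    # and old ephemeral keys are deleted, so past keys can't be rebuilt.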

Signal as an app is also very well-designed. It's extremely easy to use, with the goal of making strong encryption effortless and automatic. The importance of this should not be understated, considering end-to-end encrypted communication is often difficult and confusing – so much so that when Edward Snowden wanted to leak classified NSA documents to journalist Glenn Greenwald, he had to make a 12-minute video explaining how to use PGP, and Greenwald still couldn't figure it out.[5]

Signal sounds amazing, right? So what's wrong with it? Well, a lot, actually. Unlike PGP and OTR, which are not restricted to a specific messaging protocol and support decentralized uses, Signal is specific to a particular app and particular servers run by the Signal Foundation[6], formerly Open Whisper Systems. Users are beholden to the choices made by the Signal developers. Even though the software is free and open source, third-party clients are forbidden.[7] You can fork off and make an entire separate encrypted messaging ecosystem, but making something that talks to users of the official Signal app is a big no-no.

And Signal has made some questionable choices. For one thing, Signal users are required to sign up with their phone number and use it as their account identifier, making anonymous use of the app, as well as having multiple accounts, much more difficult. This also provides a major vector for account compromise: By default, if you want to take over someone's Signal account, you just have to take over their phone number, which is notoriously easy to do.[8]

The centralized model of the network is itself an issue. Despite the developers arguing it's a good thing, actually[9], it gives the Signal Foundation operating the servers (and possibly Amazon, the company which actually hosts the servers) disproportionate power over Signal users. You might trust the foundation now, but if, with or without outside influence, it turns malicious, decides to ban you, or shuts down altogether, your only recourse will be convincing all of your contacts to move to a different platform. It also makes Signal a desirable target: You only need to attack one organization to shut down the entire network.

There are many other issues and concerns with Signal as well, such as the use of proprietary Google services[10] and the refusal to allow a fully free Android build to be distributed through F-Droid[11], the use of Google reCAPTCHA[12], the requirement to use one of the mobile apps because the desktop client doesn't work on its own[13], and the fact that from 2013 to 2016, Signal was funded by a CIA propaganda project[14], but in this video, we'll focus on the centralization and lack of anonymity.


3.2. Session


3.2.1. Session

...which brings us to Session. Session[15] (formerly Loki Messenger) is a fork of Signal, meaning a modified version of the Signal code, which is designed to be decentralized and anonymous. It runs on Oxen's Service Node network, from which it inherits its decentralization, and users are not required to provide any personal data to sign up; they just generate cryptographic keys and start using it. Furthermore, connections in Session are onion-routed to prevent an observer from knowing who is talking with whom.


3.2.2. Session's Crypto

The Session whitepaper specifies that the app uses the Signal Protocol, but the development team later decided to drop this in favor of a simpler protocol, citing issues with reliability, multi-account support, and account restoration when using the complex Signal Protocol in a decentralized context.[16] The simpler Session Protocol offers none of the key improvements that OTR and Signal made over PGP, namely forward secrecy and deniable authentication. The Session team argues that these properties are impractical in common use. Let's explore that a bit.


3.2.2.1. Forward Secrecy

Session argues that forward secrecy – the property preventing an attacker from compromising your old messages if they compromise your key – doesn't make sense in practice.

PFS [that's Perfect Forward Secrecy, another name for forward secrecy] specifically provides protection when the long-term keys of a device are compromised. Assuming the application properly manages keypairs, the only way this should occur is through full device access. It is this detail which limits the cases in which PFS is applicable. If an attacker has full device access, decrypted messages can be pulled directly from the Signal/Session database.

Think of it this way: If I run a bunch of Oxen Service Nodes and save copies of all of your messages, and then I track you down and steal your phone, sure, I can copy your private key and use it to decrypt those messages I saved... but I can also just open the Session app and read your messages there.

That's the argument, but it hinges on something important which I'd like to contest.

Assuming the application properly manages keypairs, the only way this should occur is through full device access.

Exporting your keys is actually a common behavior in Session. Session provides a 13-word Recovery Phrase which is used as a seed to derive the user's cryptographic keys. This recovery phrase enables users to use multiple devices at the same time or to keep their identity when they get a new device. It's no stretch of the imagination to suppose that a user might back up their recovery phrase somewhere (maybe by writing it down) and that this backup of just the recovery phrase might be compromised, rather than the user's device itself with the encrypted messages on it.
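To see why this matters, here's a minimal Python sketch of seed-based key derivation. The details here are hypothetical (Session's actual mnemonic scheme is different); the point is that the phrase alone deterministically yields the long-term key, no device access required.

    # Hypothetical phrase-to-key derivation, NOT Session's real scheme.
    import hashlib
    from nacl.signing import SigningKey

    recovery_phrase = "thirteen words scribbled on a sticky note"
    seed = hashlib.sha256(recovery_phrase.encode()).digest()  # 32-byte seed

    # Anyone who finds the phrase re-derives the same long-term key and,
    # without forward secrecy, can decrypt any recorded ciphertexts,
    # all without ever touching the victim's device.
    identity_key = SigningKey(seed)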

Moving on to the next part...

PFS would be effective in the instance of a user using disappearing messages or if manually deleting messages while an attacker was intercepting their messages at the network level. In this case, the attacker could not decrypt the messages collected at the network level by compromising the long-term key of the device. However, the usage of disappearing messages is low,

This says, if you regularly delete your messages from your device, I can't read them off your device. But since most people don't do this, Session argues, forward secrecy doesn't really matter. It seems to me that the takeaway should be "Most people don't do this, so we should encourage them to", for example by making disappearing messages the default behavior in the app. But that's me.

and sophisticated attackers that have full network and device access are likely to perform easier and more damaging attacks, like accessing current contact information, reading future messages, and compromising device information — which neither the Signal protocol nor the Session protocol are currently able to protect against.

This part just seems like misdirection to me. It says that someone in a position to do this attack would likely also do other attacks we can't protect against, and for some reason that's supposed to be a justification for not protecting against this attack even though we have the ability to do so. The mentioned attacks don't preclude the usefulness of this protection.

Suffice it to say, I'm not convinced by Session's arguments, and I'm disappointed in the choice to give up forward secrecy.


3.2.2.2. Deniability

Now, let's look at deniability. One of the goals of "off-the-record" communication is that it's, well, off-the-record: no one can provide a provable record of what was said and by whom.

Deniability in the Signal protocol is a cryptographic property which prevents a cryptographic proof from being provided that a person signed a message with a particular key. In practice, cryptographic deniability is often disregarded when it comes to court cases or media reporting. For example, cryptographic deniability was used unsuccessfully as a defense in a court case involving the communications of Chelsea Manning and Adrian Lamo. Instead, courts often rely on screenshots of conversations from a dishonest chat partner or seized devices to establish the real world identities of the chat participants.

Just so we're on the same page here, this argument says that the fact that someone did not confess to something is often disregarded in court cases and media, and the conclusion drawn from this is that you should confess to everything you say... which is ridiculous.

Now, I'm not saying that cryptographic deniability is a complete defense in and of itself, but remember that the alternative is providing proof which can only be used against you. If the goal is plausible deniability, why would you provide proof, even if you think that no one on the jury will be technical enough to understand it?

Additionally, deniability fails to provide protection if a chat partner or device is compromised during the sensitive conversation. In such a case, the compromised participant or device can prove which device or chat partner sent which messages post-conversation.

Sure. Again, the set of things we can protect against is limited, but that doesn't mean we shouldn't protect against them.

Session also points out that due to accounts being anonymous by default, deniability doesn't matter as much. (You can prove that this message was signed with this key, but who's to say that that key represents Alice?) I do think this one is a valid point.

It also says that by default, signatures are discarded after verification, so an official client should not be able to prove authorship after the fact. (This, of course, doesn't stop someone from using a modified client.) The team also plans to implement a practical, non-cryptographic deniability feature, allowing users to modify local copies of messages other people send them, thus rendering any conversation shown on someone's screen completely unreliable.

That's cool, but why not have deniability both on a cryptographic level and in the app's interface?


3.2.2.3. Conclusion

Session makes cryptographic sacrifices with dubious justification, reverting to a PGP-style crypto protocol.


3.2.3. Session and ONS

Session users are identified by their public keys, which are long, random-looking alphanumeric strings that are very inconvenient to type and verify. These are also shown as QR codes which can be more conveniently scanned to add contacts.

To make this process easier, users can register usernames with ONS, mapping their username to their public key. Session usernames do not expire.[17]


3.2.4. Session Security Audit

In 2020 and 2021, Session was audited by a cybersecurity research company called Quarkslab.

Session announced the results, saying:

Understandably, plenty of privacy enthusiasts, journalists, activists, and other groups have been waiting for Session to be audited before they gave it their full support. Like we said before, it's all about trust, and for things like the Secure Messaging Apps Comparison, having no code audit made it hard to trust Session — now, Session is easy to recommend.[18]

This audit[19] found some issues, most of which were either fixed or declared features. Only one was considered high severity, and that one was promptly fixed. Quarkslab gave an overall favorable conclusion:

Oxen Session really improves Signal privacy and resilience

(Note, that's privacy and resilience, not security.)

by using an overlay network to the existent end-to-end encryption instant messaging solution. The onion-routing mechanisms make use of Oxen's Snodes [that's Service Nodes] to store and exchange messages, however, there are some other centralized standard web services that are still used through the overlay network (for the push service and to deliver attachments files).

That's worth knowing. The push services (Apple's on iOS and Google's on Android) can be disabled within the apps if this is a concern for you. If they're disabled, the apps regularly poll for new messages instead.

All major concerns have quickly been fixed.

Great!

The overall security level of this application is good. With slight exceptions, all the good practices have been kept in mind when developing this product in each platform specificities as well as in the global architecture of Session.

So that's good! That's a favorable review. But let's also take a look at the scope of the audit.

Through this audit we reviewed three components that are part of Session, each evaluation was performed by one evaluator in 10 days.

Those audits were carried out sequentially in the following order:

  • the Android application;
  • the iOS application;
  • the Desktop application.

Now, this next part is very important:

Vendor [that's Oxen] performed a complete redesign of the cryptographic protocol used for encryption and signature of messages. This refactoring took place right before the third and last audit of the desktop version. 10 more days were added to evaluate this protocol.

So the versions of the Android and iOS applications that were audited used a completely different cryptographic protocol (that being Signal) than the current apps do. However, the desktop version used the new Session Protocol, and extra time was given to evaluate this new protocol.

Due to the large code base and multiple inclusions of third-party code, this analysis is not exhaustive. Yet, it is sufficient to highlight a few vulnerabilities that could be fixed.

Right, so this audit is not considered comprehensive and does not claim to have found all the security issues that might be in the apps. It's also important to remember that audits look at a snapshot in time. In any software project, when the code changes in later versions (as had already happened in a major way in the mobile apps by the time the audit came out), it might introduce new issues.

Unfortunately, continuous auditing is cost-prohibitive.


3.3. Conclusion

To recap: Session is a fork of Signal which addresses two of Signal's issues, centralization and lack of anonymity, but it tosses away the cryptographic benefits of Signal in the process.


Next: Thoughts on the Oxen Ecosystem