Australia’s Horrible New Encryption Bill and Its Implications for TallyLab

Posted by Jordyn on December 11, 2018 under App News

Australian Coat-of-Arms

As you might have heard, Australia passed legislation called the Assistance and Access Bill 2018 (#aabill) on Friday. The bill’s aim is to enable law enforcement to access any encrypted data they feel may be relevant to a criminal investigation, and it achieves that aim by requiring companies to build into their software ways to access and decrypt users’ data (aka backdoors).

The problem with backdoors is that they potentially let everyone in, not just the people they were built for.

The #aabill also allows any employee of a company (not just leadership) to be approached by law enforcement for access to user data, and compels the employee to keep the request (and their compliance with it) a secret from their employers.

Read more on what the bill is and what its global implications might be at The Verge and Wired.

Just to be clear: We wholeheartedly disagree with this legislation.

What are the bill’s implications for TallyLab?

User data in TallyLab is end-to-end encrypted. It’s encrypted on the device, it stays encrypted in transit to be backed up, the backup itself is encrypted, and it stays encrypted when syncing data between devices and users.

TL Encryption

Data is only decryptable locally, which means only users hold the keys to decrypt their data. We use a user's public encryption key to identify which backup belongs to them, but that key is not associated in any way with the person's identity. And the key is itself encrypted while in transit to locate a backup, so intercepting it wouldn't reveal anything.

The only way an outside actor can get access to a specific user’s data is if they have access to the user’s device and/or access to their encryption keys.
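To make the flow above concrete, here is a toy sketch in Python. It uses an XOR keystream as a stand-in for real authenticated encryption (do not use this cipher for anything real), and all names (`user_key`, `backup_id`, the sample entry) are hypothetical; the point is only that the server ever sees ciphertext filed under an opaque, key-derived identifier.

```python
import hashlib
import os

# Toy stand-in for a real symmetric cipher (XOR keystream derived with
# SHA-256). Illustration only -- a production system would use a real
# authenticated cipher.
def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(p ^ s for p, s in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # XOR with the same keystream reverses itself

# --- On the user's device: keys never leave the client ---
user_key = os.urandom(32)    # only the user holds this
public_key = os.urandom(32)  # stands in for the user's public key

entry = b"2018-12-11: slept 7.5 hours"
ciphertext = toy_encrypt(user_key, entry)

# --- On the backup server: only opaque data is visible ---
# The backup is filed under a key-derived identifier; nothing stored
# here links to a real-world identity or reveals the plaintext.
backup_id = hashlib.sha256(public_key).hexdigest()
server_store = {backup_id: ciphertext}

# --- Back on a device holding the keys: decryption succeeds ---
restored = toy_decrypt(user_key, server_store[backup_id])
assert restored == entry
```

Without `user_key`, the server (or anyone who seizes it) holds only `backup_id` and `ciphertext`, neither of which is useful on its own.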

A Few Scenarios

Australian Police Car
By ガソリンスタンド — Own work, CC BY-SA 4.0

Scenario 1: Australia comes to us asking for a specific user’s data.

Our response would be: No can do. Only the specific user themselves can give you access to their data. We don’t have access to their encryption keys, and without the keys we can’t even locate, much less decrypt, their data.

Why that’s good: At the very least, the agency making the request would have to interact with the person — via warrant or similar — in order to get their encryption keys, thereby alerting them that an investigation is underway.

But it also means there can be no large-scale harvesting of user data from our system. There’s simply no technical way to do so. Yes, a law enforcement officer (or hacker) could go after one user via that person’s device, but the damage stops there.

He's Making a List

Scenario 2: Australia insists we keep a list of our users’ identities associated with their encryption keys.

Our response would be: No. And, unfortunately, this would force us to take measures to block traffic from Australia.

Why that’s tough to do in practice: The only data we gather* that even implies location is the IP address, which is a notoriously noisy signal and is easily circumvented with a VPN or other methods.
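A minimal sketch of what IP-based blocking would look like, using Python's standard `ipaddress` module. The CIDR range below is a documentation placeholder, not a real Australian allocation; a real geo-IP database holds thousands of such ranges and is perpetually incomplete and out of date.

```python
import ipaddress

# Placeholder range (203.0.113.0/24 is a reserved documentation block,
# standing in here for a hypothetical Australian ISP allocation).
AU_RANGES = [ipaddress.ip_network("203.0.113.0/24")]

def looks_australian(ip: str) -> bool:
    """Return True if the address falls in a range we believe is Australian."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in AU_RANGES)

looks_australian("203.0.113.42")  # True  -- request would be blocked
looks_australian("198.51.100.7")  # False -- request passes
# A user on a VPN exit node outside these ranges also returns False,
# which is exactly why IP-based blocking is so easy to circumvent.
```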

The way the bill is currently worded, it’s unclear exactly what gives Australia jurisdiction to make encryption-busting requests of companies. Is it that the company must be located in Australia? We aren’t. Does the company need to have employees or an office in Australia? We don’t. Or does the company need to have users who are Australian? That we do.

We are very interested to see how this all plays out. But in the meantime, we’re not giving Australian authorities access to your data. Furthermore, we hope that once the full ramifications of the bill are brought home to Australian citizens, it will be repealed.

*We gather IP addresses in order to triangulate usage stats, but they are never associated with individual user identities.