The Privacy Blog
Privacy, Security, Cryptography, and Anonymity

Category: Personal Privacy

[Image: iPhone with soldering iron]

There is a lot of hand-wringing about the announcement that the FBI, with outside help, has been able to break into Syed Farook’s iPhone. This is not at all the same situation we would have if Apple had agreed to create the FBI-requested version of the operating system. The important difference is scalability.
With this announcement we now know that law enforcement can break into any iPhone (of that generation or earlier, at least) given sufficient effort. That effort is the key. It appears that the phone hack requires, at a minimum, disassembling the phone and desoldering at least one chip. The actual process may be even more complicated and cumbersome.
This is absolutely not something that any government is going to do thousands of times: it cannot be done quickly, and it would probably leave evidence of the activity. That is fine for investigations of high-value cases, but it is useless for mass surveillance.
Contrast that with what could happen if Apple had created the security-bypass operating system. Once created, its use would certainly be compelled in many other cases, and governments around the world would all demand access to the tool. That tool would allow rapid, software-only compromise of phones without physical modification, an attack that scales to large numbers much more easily. Fortunately, it would still require physical access to the phone, but that access could be obtained in many ways, both overt and covert. I suspect that the compromised OS could be delivered through a modified phone charger, for example.
Doubtless many companies will be working to make their devices secure against this kind of physical attack, as well as making the kind of modification the FBI requested actually impossible. In the meantime, the effort required to compromise each phone ensures that only a very few phones belonging to very narrowly targeted individuals will be unlocked. I can live with that.

· · ·

[Image: iPhone lock screen, iOS 8]

Since the iPhone was introduced, Apple has had the ability to decrypt the contents of iPhones and other iOS devices when asked to do so (with a warrant).

Apple recently announced that with iOS 8 it will no longer be able to do so. Predictably, there has been a roar of outrage from many in law enforcement. [[Insert my usual rant about how recent trends in technology have been massively in favor of law enforcement here]].

This is really about much more than keeping out law enforcement, and I applaud Apple for (finally) taking this step. They have realized what has been a foundational truth for Anonymizer: if data is stored and available, it will get out. If Apple has the ability to decrypt phones, then the keys are available within Apple. They could be taken, compromised, compelled, or simply brute-forced by opponents unknown. This is why Anonymizer has never kept data on user activity.

Only by ensuring that it cannot do so can Apple provide real security to its customers against the full range of threats, of which US law enforcement is potentially the least worrisome.

Lance Cottrell is the Founder and Chief Scientist of Anonymizer. Follow me on Facebook, Twitter, and Google+.

· · ·

[Image: Hiding head behind laptop]

In many cases, a false sense of security causes people to put themselves at much greater risk.

The following article describes a “burner” phone service that re-uses its temporary phone numbers. It appears that the number a security researcher received was previously used by a sex worker, whose clients continued to send pictures and messages to the number after it had been reassigned.

DOH!


Recycled ‘burner’ number sends sex worker’s clients to security researcher | ZDNet

Lance Cottrell is the Founder and Chief Scientist of Anonymizer. Follow me on Facebook, Twitter, and Google+.

· · ·


The Internet is on fire with discussions of the recent release of stolen nude photos of over 100 female celebrities. This is a massive invasion of their privacy, and it says something sad about our society that there is an active market for such pictures. While this particular attack was against the famous, most of us have information in the cloud that we would like to stay secret.

While there is not yet a definitive explanation of the breach, the current consensus is that it was probably caused by a vulnerability in Apple’s “Find My iPhone” feature. Apparently the API for this service did not limit the number of failed password attempts, a standard security practice. This allowed attackers to test an effectively unlimited number of passwords against each account they wanted to access.

Because most people use relatively weak passwords, this attack is quite effective. Once the attackers gained access to an account, they could sync down photos or any other information stored in iCloud.
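To see what the missing safeguard looks like, here is a minimal sketch of the kind of failed-attempt lockout the reports say the API lacked. The class name, threshold, and lockout window are illustrative assumptions of mine, not Apple’s actual implementation.

```python
# Sketch of a failed-login lockout; thresholds are illustrative assumptions.
import time


class LoginThrottle:
    """Locks an account after too many recent failed password attempts."""

    def __init__(self, max_failures=5, lockout_seconds=900):
        self.max_failures = max_failures
        self.lockout_seconds = lockout_seconds
        self.failures = {}  # account -> list of failure timestamps

    def is_locked(self, account):
        now = time.time()
        recent = [t for t in self.failures.get(account, [])
                  if now - t < self.lockout_seconds]
        self.failures[account] = recent
        return len(recent) >= self.max_failures

    def record_failure(self, account):
        self.failures.setdefault(account, []).append(time.time())


# Usage: refuse to even check the password once the account is locked.
throttle = LoginThrottle()
if not throttle.is_locked("victim@example.com"):
    # ... verify the password here; if it is wrong:
    throttle.record_failure("victim@example.com")
```

With a check like this in place, an attacker gets only a handful of guesses per account instead of millions, which is exactly why its absence made the brute-force attack practical.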

Of course, the first rule of secrecy is: If it does not exist, it can’t be discovered.

If you do want to create something that you would be pained to see released publicly, then make sure you keep close control of it. Store it locally, and encrypted.

Wherever you keep it, make sure it has a strong password. Advice for strong passwords has changed over time because of the increasing speed of computers. It used to be that fancy mnemonics would do the trick, but now the fundamental truth is: if you can remember it, it is too weak.
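A rough back-of-the-envelope comparison shows why. The alphabet size, vocabulary size, and guess rate below are assumptions I am using purely for illustration, not measured figures.

```python
import math

# Rough entropy estimates, in bits, for two password styles (illustrative only).

# A random 21-character password drawn from ~90 printable symbols:
random_bits = 21 * math.log2(90)          # roughly 136 bits

# A "memorable" password built from four common English words, where the
# attacker guesses whole words from a ~20,000-word vocabulary:
mnemonic_bits = 4 * math.log2(20_000)     # roughly 57 bits

# Time to exhaust each space at a hypothetical 10 billion guesses per second:
rate = 1e10
for name, bits in [("random 21-char", random_bits), ("4-word phrase", mnemonic_bits)]:
    years = 2 ** bits / rate / (3600 * 24 * 365)
    print(f"{name}: {bits:.0f} bits, ~{years:.2e} years to brute force")
```

Under these assumptions the memorable phrase falls within reach of a determined attacker, while the random string does not.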

This is particularly true because you need to use a completely different password for every website. Modifying one good password in a simple, predictable way for each site is not enough. It might prevent brute-force attacks, but if some other attack exposes one of your passwords, the attacker will easily guess your passwords on every other website.

You need to be using a password manager like 1Password (Mac), LastPass, Dashlane, etc. Let the password manager generate your passwords for you. This is what a good password should look like: wL?7mpEyfpqs#kt9ZKVvR

Obviously I am never going to remember that, but I don’t try. I have one good password that I have taken the time to memorize, and it unlocks the password manager which has everything else.
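For the curious, this is roughly what a password manager’s generator does under the hood. The sketch below uses Python’s standard-library secrets module; it is my own illustration, not the code of any of the products named above, and the symbol set is an arbitrary choice.

```python
# Minimal sketch of random password generation, similar in spirit to what a
# password manager does. Alphabet and length are illustrative choices.
import secrets
import string


def generate_password(length=21):
    alphabet = string.ascii_letters + string.digits + "!#$%&?@^*"
    return "".join(secrets.choice(alphabet) for _ in range(length))


print(generate_password())  # e.g. something like wL?7mpEyfpqs#kt9ZKVvR
```

The important design point is using a cryptographically secure random source (secrets) rather than a predictable one, so the result cannot be guessed from anything about you.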

UPDATE: There appears to be some question about whether this vulnerability is actually to blame.

· · ·

[Image: Unknown known]

Your Anonymous Posts to Secret Aren’t Anonymous After All | Threat Level | WIRED

This article describes a clever attack against Secret, the “anonymous” secret sharing app.

The technique allows an attacker to isolate a single target, so any posts the attacker sees are known to come from that person. The company is working on detecting and preventing the attack, but it is a hard problem.

In general, any anonymity system needs to blend the activity of a number of users so that any observed activity could have originated from any of them. For effective anonymity, that number needs to be large. Just pulling from the friends in my address book who also use Secret is far too small a group.
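To make the point about anonymity-set size concrete, here is a tiny sketch under the simplifying assumption that the attacker considers every member of the set equally likely to be the author. The numbers are illustrative and not taken from the article.

```python
# Why anonymity-set size matters, assuming (simplistically) that every member
# of the set is equally suspect. Numbers below are illustrative only.


def attribution_confidence(anonymity_set_size):
    """Best-case probability that the attacker correctly names the author."""
    return 1.0 / anonymity_set_size


for n in [1, 7, 100, 10_000]:
    print(f"set of {n:>6} users -> {attribution_confidence(n):.2%} confidence")

# The Secret attack works by engineering a "set" of exactly one friend,
# so every post the attacker sees is attributed with complete confidence.
```

Real attackers can often do better than uniform guessing by using side information, which is why the set needs to be large and well mixed, not just larger than one.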

Lance Cottrell is the Founder and Chief Scientist of Anonymizer. Follow me on Facebook, Twitter, and Google+.

· · · · ·
