Monthly Archives: December 2015

Finally have Apple Pay working again on Watch

As my faithful readers know 😉, I really like the Apple Watch. I wrote in September about getting wrist detection working again, and it’s been great since then…with one exception. Without wrist detection enabled, you can’t use Apple Pay (it won’t let you store cards without wrist detection enabled). When I turned wrist detection off, before that fix came with watchOS 2, my Apple Pay configuration went away; that’s just the way it works. That doesn’t sound so bad, but once wrist detection was back, I couldn’t add cards to Apple Pay. They would get stuck “activating.”

I did a fair bit of Googling, and it appeared to be an issue with some bit of iPhone storage not getting completely cleared out when a card is removed. Supposedly a restore of the phone fixed things for many folks. However, I didn’t want to go to the trouble of doing that, so I just ignored the one missing feature. After December’s Watch and iOS updates, I decided to try again and, lo and behold, it worked! I loaded up some cards and went Christmas shopping on Friday. Woohoo! I’m a happy camper.

Encryption, backdoors and spies, oh my!

Since the Paris terrorist attacks and then the San Bernardino shootings (now confirmed to be an internationally inspired terrorist attack carried out by a lifetime US resident and citizen), there has been much discussion among talking heads, on screen and in print, about needing to be able to eavesdrop on all communications. Many pundits, candidates, and congressmen have jumped on this bandwagon, calling for more surveillance and for the means to access any encrypted communication. Many of these same advocates of eavesdropping are ardent supporters of the 2nd Amendment, yet forget the rights of the people to communicate, to assemble, and to be protected from unreasonable searches and seizures (the 1st and 4th Amendments). Putting aside the legal issues and politics wrapped up in the matter, however, this is technically a very bad idea.

First, let’s consider encryption alone, without any sort of backdoor or key escrow. Encrypted communication has been with us as long as there has been writing, and really, as long as there has been spoken language. Fundamentally, it’s communication that can’t be deciphered due to some sort of obfuscation. This can manifest as something intelligible only to the communicating parties, such as a jumble of letters or symbols, or as common words (spoken or written) to which a secret meaning is ascribed, known only to those parties. The common thread is some shared knowledge the communicating parties can use to extract the hidden meaning: either a shared secret or knowledge of the location of a message. Cryptography is an old art, dating back several thousand years (see also this article for more history). There are myriad non-digital ways to hide information, and a quick overview of the Wikipedia article on Steganography can be quite illuminating to the uninitiated, although computers have opened up many new avenues for these practices. Classic encryption took the form of a shared secret (a word, a phrase, words on the pages of a book, etc.) that could be used to encrypt and decrypt the coded message. Innovative ways of doing this, in particular changing the shared key, made such messages very secure. In the digital world, this is called symmetric key cryptography.
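To make the shared-secret idea concrete, here is a toy sketch in Python. A repeating-key XOR like this is emphatically not secure (real systems use vetted ciphers such as AES), but it shows the essential property of symmetric cryptography: the same key, known only to the two parties, both encrypts and decrypts.

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each message byte with the repeating key. XOR is its own
    # inverse, so the same function encrypts and decrypts.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

shared_secret = b"swordfish"              # known only to the two parties
ciphertext = xor_cipher(b"meet at dawn", shared_secret)
plaintext = xor_cipher(ciphertext, shared_secret)
assert plaintext == b"meet at dawn"       # round trip recovers the message
```

Anyone without the shared secret sees only the scrambled bytes; anyone with it recovers the message in one pass.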

Widespread use of computers has created many types of communications where information needs to be shared, but also protected, and this brought about the rise of public key cryptography (a pair of algorithmically related keys) and digital signatures as a means of solving the shared-secret conundrum. A fundamental point to take away, however, is that in the digital world, encryption, whether with a shared secret or a public key pair, boils down to algorithms implemented in computer code. This is embedded in tools you use every day on your computer. Whenever you see a “lock” or other security symbol in the URL display of your browser, you are seeing the results of these algorithms implementing public key encryption. Many cryptographic algorithms throughout cyberspace are in the public domain and can be used by anyone; a quick trip to Google will show you this. Likewise, many derivative works exist that are not cataloged. Herein lies the first lesson: the US does not “own” cryptography or algorithms, and many freely available algorithms and code implementations of those algorithms are beyond the reach of US laws. Efforts to constrain or weaken encryption will not affect those who want to hide their communications. Just as toolkits are available for the propagation of computer malware (another interesting story!), toolkits for encryption are available and will remain available regardless of legislation. To paraphrase a saying of 2nd Amendment supporters, “when encryption is outlawed, only outlaws will have encryption.”
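The “pair of algorithmically related keys” can be illustrated with textbook RSA. The primes below are absurdly small (real keys use primes hundreds of digits long) and this omits padding and every other real-world safeguard, but it shows the core relationship: what one key of the pair encrypts, only the other can decrypt.

```python
# Textbook RSA with tiny numbers, purely illustrative.
p, q = 61, 53
n = p * q                        # public modulus (part of both keys)
phi = (p - 1) * (q - 1)          # Euler's totient of n
e = 17                           # public exponent, coprime with phi
d = pow(e, -1, phi)              # private exponent: modular inverse of e

message = 42                     # a message encoded as a number < n
ciphertext = pow(message, e, n)  # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
assert recovered == message
```

The public key (e, n) can be handed to anyone, solving the problem of first agreeing on a shared secret; only the holder of d can read what was encrypted with it.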

We have seen this week the revelation that Juniper Networks found at least one “backdoor” in their router/firewall operating system, one that had been there since at least 2012. What was the source of this intrusion? Probably nation-state hackers. Who? Good question. Why? Putting such code in network appliances gives the owner of the exploit, as the article says, the ability to access resources behind the firewall, the ultimate target. So, how are these thoughts connected? A backdoor in a network appliance can allow someone to bypass controls, and the fact that this exploit went unnoticed for years speaks to the difficulty of checking for such intrusions. Likewise, if a backdoor were engineered into an encryption system, as is promulgated on various fronts today, it would be vulnerable to misuse and unauthorized access. This would impact the security and privacy of legitimate users of the encryption system. Would it improve security? I think not, as those who are really concerned about eavesdropping on their conversations will take additional measures, such as using internationally sourced tools (or tools written by a trusted colleague), or simply obfuscating the messages carried by the communication system. We would have a backdoor to access the communications path, but we couldn’t see anything (at least prospectively) other than innocuous communication (remember steganography?). The value of a backdoor in encryption systems for preventing terrorist attacks is thus minimal, while the breakdown in the privacy of communications for everyone else is significant.

Conversely, focusing not on content but on the patterns of communication (the so-called metadata), or observing other external phenomena, does have value. If someone is communicating with known terrorists or in places frequented by such individuals, that can and should raise a red flag. Traditional methods of surveillance can then be employed, including bypassing the encryption challenge by placing “taps” (malware) on a suspect’s devices and thus viewing the decrypted messages. That still leaves the challenge of obfuscated messages, but it is more useful. Much can be learned from observation of patterns and metadata. A classic example is gauging the likelihood of imminent military action by watching the number of evening pizza deliveries to the Pentagon.
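As a hypothetical sketch of the metadata idea: without reading any content at all, simple call-record patterns (who contacts whom, and how often) can surface links to known suspects. All numbers and thresholds below are invented for illustration.

```python
from collections import Counter

# Hypothetical flagged numbers and call records (caller, callee) --
# metadata only, no message content involved.
watchlist = {"+1-555-0100", "+1-555-0199"}

call_records = [
    ("+1-555-0142", "+1-555-0100"),
    ("+1-555-0142", "+1-555-0100"),
    ("+1-555-0142", "+1-555-0123"),
    ("+1-555-0177", "+1-555-0150"),
]

# Count how often each caller reaches a watchlisted number.
hits = Counter(caller for caller, callee in call_records if callee in watchlist)

# Flag repeat contacts (threshold of 2 is arbitrary here).
flagged = {caller for caller, count in hits.items() if count >= 2}
```

The flag is only a starting point for the traditional, targeted surveillance described above, not proof of anything by itself.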

In summary, encryption has been with us since the beginning of communication. Computers are tools used in encrypting messages, but they have not changed the fact that those planning activities they want kept secret have many channels available to them. Sophisticated actors will layer protections on their communications, and simple backdoors into our personal devices or encryption tools will not pierce that veil. If backdoors are in place, we run the very significant risk that they will be used by actors other than the intended “official” users, and we will have compromised the security of all while gaining little or nothing in return.