While I am not a fan of the kind of tech reporting publications like the New York Times do, once in a while they produce a story worth reading. Such was the case recently, when they penned a piece about Australia enacting a law that empowers the authorities to compel tech giants to create ways around the encryption built into their products.
This has been a touchy subject for some time. We all know that the tension between individual privacy and collective security has been around since the beginning of modern civilization and has been run through all kinds of trials and tribulations. We think we have a handle on it when, all of a sudden, a new scenario emerges.
Perhaps what made the biggest mark on this was the standoff, a couple of years ago, between Apple and the FBI over potential evidence in a suspect’s locked phone. I will not go into the details because there was a plethora of coverage; just search on Apple vs. the FBI if you want to know more.
Since then, we have seen a widening of the debate as to exactly who should be able to access private data, and under what circumstances, and whether phone manufacturers can, or should, be compelled to use their “backdoor” access capabilities to assist the legal and proper recovery of such data.
There are two distinct camps here. One says that law enforcement, with adequate safeguards, should have the right to access private data that can have a bearing on criminal investigations. The other side says that this should never be allowed because of the potential for abuse.
In the Apple/FBI case, Apple claimed that they did not have the ability to access a user’s phone data. Even if they could, Apple noted that such a move had the potential to compromise millions of other users’ phones.
I called nonsense on that then, and I still do today. I do not know of any chip or device manufacturers who do not build some type of OEM access port or system into hardware for backdoor access (even Intel does this with processors). It is a valid design criterion that serves multiple purposes, from patching to upgrading. And it will (and should) continue. But that is not the issue. The issue is who has a right to use it, and when.
Things change in the progression of the human race. To wit, the many reinterpretations of the original constitutional amendments. A classic case is the Second Amendment. It was NEVER intended to enable citizens to own 50-caliber machine guns mounted on jeeps. There are those who argue that the amendments should be reinterpreted, periodically, to take into account advances in civilization. That, for better or worse, is one of the most used arguments for the expansion of firearm ownership.
OK, back to technology. If that argument is considered valid, it should follow that the march of technology has presented many new issues never envisioned even a couple of decades ago, let alone centuries. One of those revolves around privacy and its effect on safety and security. Hence the argument over the right to access private data in today’s environment.
Before I go on, I am of the position that, with adequate safeguards, law enforcement and bona fide security agencies should have the right to retrieve potential evidence or other critical data from electronic devices deemed related to security issues. Now, before everybody goes off on me, I reiterate: adequate safeguards. What that means to me is that there must be indisputable justification for such actions.
Therefore, I am glad to see a country move in that direction. The Australian government has just enacted a law that allows law enforcement authorities to compel tech-industry giants to develop methodologies to circumvent the encryption built into their products. While this applies only to Australia, it has the potential to set a precedent with global impact.
Now the battle begins. Tech companies have argued for decades that unbreakable encryption is essential to protecting the private communications of their customers. There is no doubt that such safeguards are necessary and warranted. But the extent to which these tech companies argue the issue is too broad.
This is no longer the era of only physical evidence. Much evidence is virtual — computers, phones, digital assistants, digital video/audio, etc. And having to struggle to obtain such evidence makes it difficult, or even impossible, for investigators to gain access to things such as online discussions among crime suspects, particularly in time-sensitive or terror investigations.
There are protections within the Australian law. For example, authorities cannot demand universal decryption capabilities or introduce system-wide weaknesses. Apple replied that it is impossible, for example, to create a workaround for one iPhone’s encryption without potentially introducing something that could work for all of them.
That is nonsense. My experts tell me it is not that difficult to develop a back door that, if properly implemented, can be unique to individual devices. Compromising one device will not create a system-wide breach potential.
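The idea that access can be scoped to a single device rests on standard key-derivation practice. Here is a minimal, purely illustrative sketch — not any real vendor’s design — assuming a hypothetical vendor-held master secret and device serial numbers: each device gets its own key derived via HKDF (RFC 5869), so compromising one device’s key reveals nothing about the master secret or any other device’s key.

```python
# Illustrative sketch: per-device key derivation using HKDF-SHA256
# (RFC 5869). The "vendor master secret" and device serials here are
# hypothetical placeholders, not any real product's scheme.
import hashlib
import hmac

def hkdf_sha256(master_secret: bytes, device_id: bytes, length: int = 32) -> bytes:
    """Derive a device-unique key from a master secret (extract + expand)."""
    # Extract: mix the master secret with a fixed salt into a pseudorandom key.
    prk = hmac.new(b"fixed-salt", master_secret, hashlib.sha256).digest()
    # Expand: bind the output key material to this specific device ID.
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + device_id + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

master = b"vendor-master-secret"  # hypothetical escrowed secret
key_a = hkdf_sha256(master, b"device-serial-A")
key_b = hkdf_sha256(master, b"device-serial-B")
assert key_a != key_b  # each device ends up with a distinct key
```

Because HKDF is a one-way construction, unlocking device A with `key_a` gives an attacker no feasible path back to `master`, and therefore no path to `key_b` — which is the property the per-device-backdoor argument relies on.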
Immediately, of course, the hand-wringers weighed in. Apple officials called the law “dangerously ambiguous” and “alarming.” Mike Cannon-Brookes, one of the founders of Atlassian, a business software company that is among Australia’s biggest tech companies, said, “All of Australian technology is tarnished by it.” And Sarah Moran, whose Girl Geek Academy teaches young women in Australia to code, said, “Why would I tell young girls to go build tech here if there’s not going to be any tech industry?” Huh? How does this law instantly dissolve the tech industry?
Australia is not the first to do this. Great Britain has something similar, but it is not as comprehensive.
For a long time now, tech companies, fearing something like this was on the horizon, have argued that they cannot be compelled to create tools for breaking the encryption in their products. Their argument is based on their belief that code should be considered a form of “free speech” and protected under the First Amendment – seriously?
There are some far-reaching implications here and many unknowns, such as exactly whom it will apply to. For example, will it apply to anyone in the chain who touches the data: communication providers, websites, any service that supplies or forwards data to an end user?
Initially, the thought was to target smartphones, digital assistants and social media. But the implications go much wider when one drills down.
The law has teeth, as well. Non-compliance can result in asset seizure as well as the possibility of executives being jailed for contempt if they refuse to comply.
There are myriad lower-tier issues as well. For example, what would be the bounds of disclosure when subversive or other criminal data sweeps in unwitting individuals outside the intended participants?
While this is a slippery slope for all to tread, it is a step in the right direction. It is not fair to the innocent to tip the scales so far in the name of privacy that the nefarious elements are allowed to conduct illicit and criminal behavior knowing what they do and say cannot be uncovered. Privacy is not all-inclusive! We have a right to protect the innocent by using any and all legal means to do so. Sometimes laws just have to change to keep up with the times.