You can ignore it, minimize it or dismiss it, but there is a problem. The systematic abuse of surveillance systems against journalists, activists and political dissidents is a documented fact. Numerous reports in recent years show that the use of spyware to suppress dissent and cover up corruption isn’t an isolated event. It’s a sad reality.

Very often these attacks are carried out by oppressive governments around the world with the help of tools and services provided by Western companies, which have been repeatedly discovered and exposed, yet continue to operate with impunity.

Regulation or not

The policy making and privacy communities identified export controls as a potentially viable way to increase oversight and accountability for those who market these invasive technologies around the world. The scheme is very simple: companies wishing to sell technologies classified within a selected group outside of a given region must obtain a license from their local government; the licensing authority evaluates the risks involved in such a sale and decides whether to grant or reject the license; if companies are caught exporting to unlicensed countries, they can incur legal penalties.

The tricky part is defining the technologies that need to be licensed accurately enough to avoid harming those who are not the intended targets of such controls: security researchers first and foremost.

Privacy activists claim that the ongoing implementation of export controls - for example, recently by the European Union - isn’t harming researchers in any way. A large faction of the security community claims the opposite. I don’t know where the truth lies. What I do know is that there is no inclusiveness in this debate, which is instead starting to look like a rough-and-tumble fight.

I do have my doubts about export controls. I’m in no position to judge whether security researchers are harmed by them or not; that is something that should be subject to rigorous, meaningful debate. What I mostly question is their effectiveness in their current form.

You’ll never hear export controls advocates say they are a perfect solution, only a stepping stone. What I find bizarre is that while many legal experts from several European Union member states deem the use of intrusion software to be unconstitutional, we’re now debating how we can properly and safely export it abroad. Where’s our integrity?

In speaking with legal experts on the matter and with licensing authorities themselves, there seems to be common agreement that regulating technology itself is a bad idea. What they’re attempting instead is to assess the intentions of the license applicants and determine whether the legal and human rights conditions exist to allow companies to do business in the given country.

I’m skeptical. I find it very arrogant to claim the ability to effectively determine the legitimacy of intent not only of the applicant, but of the acquiring country as well. I doubt anybody can have proper insight into the political, social and cultural conditions of foreign countries at any point in time and make accurate judgements up front. I can also imagine that political alliances and economic interests will largely take precedence over objective concerns about human rights abuses. Libya was an ally of many and an important economic partner, until suddenly everyone realized Gaddafi was a ruthless dictator.

Additionally, if the intent is to control surveillance profiteers who lack any moral compass or ethical principles, shouldn’t we expect them to simply move their businesses abroad, likely under far less democratic oversight and accountability?

Furthermore, assuming there is a concrete risk of collateral damage for security researchers as a result of this regulation, and considering the issues I explained above, are export controls a good idea in the first place?

I don’t know. I do not think export controls are going to stop a booming surveillance industry or prevent the abuses that occur so commonly nowadays. I don’t think anything will. However, if export controls can be a viable tool for punishing those who repeatedly work to empower dictatorships, I’m not against them. I want liability, accountability and justice for those who refuse to recognize the harm they have caused and who refuse to help mitigate it. Let me tell you: out of the many cases I’ve seen - where some victims even went to jail or into exile for their dissent - there hasn’t been a single instance where the suppliers of the regimes humbly acknowledged the incident, let alone assisted in remediation.

Surveillance vendors are the root cause of this problem, and I find it troubling that they’re capitalizing on the legitimate concerns of the very security researchers they’ve harmed. They make no contributions to our community and should have no part in this debate, which they are instead polluting and manipulating.

Researchers do need protection

The most significant concerns expressed within the security community seem to relate specifically to the licensing of exploits and software vulnerabilities. Frankly, when reflecting on export controls I never even considered exploits: firstly because I rarely find them used in the surveillance abuses I observe, and secondly because policy makers always reassured me they would not be regulated, due to the technical difficulties surrounding them.

As a matter of fact, it is still not clear to me whether that’s the case. I had no part in drafting any export control legislation, nor am I well informed on any particular implementation. If software vulnerabilities do in fact fall within the current controls, these would be the non-negotiable terms I’d require:

  • The regulation must in no way affect the free flow of security research done in the public interest, especially when provided to the public.

  • The regulation must in no way prevent security researchers from reporting vulnerabilities to bug bounties, directly to vendors or through other intermediaries. The process of researching, discovering and disclosing software vulnerabilities is sacred to the improvement of our systems, and must therefore be preserved and protected. As Chris Evans put it, “regulate those who massively profit from putting others at risk” instead.

  • The regulation must in no way affect the distribution of software, including commercial software, that is intended for security testing, auditing and defense purposes. I’m hopeful there are ways to draw semantic distinctions between security products and surveillance services.

  • The regulation should not apply to any free and open source software.

If it is not possible to guarantee these basic requirements, then I would agree that any implementation of export control legislation must be avoided. Perhaps we can achieve accountability through other means. Collateral damage is bad, but impunity is just as bad.

Time to evolve

Nobody is intentionally trying to obstruct security research. If there was no involvement of the security research community in the regulatory process, it’s probably because policy makers mistakenly did not recognize the possibility of collateral damage. They thought they had it figured out, and we were not there to prove them wrong. It’s our fault as much as theirs.

This is a lesson to learn. Security has grown to be an asset in the hands of a very diverse pool of actors: researchers, hackers, governments, corporations, profiteers, scumbags, activists, politicians. As a security community we need to participate more actively in the public and political debate, because nobody is going to protect our interests and preserve our rights for us. We have a role in society, and we have a duty to be active players in shaping its course.

Security and technology have become major political agenda items, and they will be regulated whether we like it or not. Avoiding meaningful participation is an abdication of responsibility.