2020-04-24
------------------------------------------------------------------

Here's a quote from:

    Carnegie Council Audio Podcast

    "Ethics, Surveillance, and the Coronavirus Pandemic"

    Arthur Holland Michel

    Published April 20, 2020

------------------------------------------------------------------   

The worrying thing is that if a technology has not been outed,
usually by some heroic reporter who has gotten an anonymous tip or
has been combing through obscure public records, we don't actually
have any way to know about it.

The default of a lot of the companies that operate in this space,
and of course of the users that operate in this space, is to avoid
public scrutiny at all costs because it makes their job a lot more
difficult. One of the companies that has been known to provide
location data to federal law enforcement agencies, a company
called Babel Street, actually has a user agreement where
the law enforcement agency agrees not to disclose its use of this 
technology, even in court proceedings. So even if an individual is
charged with a crime as a result of these investigative
techniques, the government is not allowed to disclose that it used
those techniques in gathering that information.

That's the crux of the issue. Why is it that these companies are
able to have those user agreements? How is it that law enforcement
agencies are able to deploy these technologies without announcing
their intentions to the world? It's not because they haven't done
their legal analysis. They have done the analysis, and what they
found was that the law says nothing about these technologies.

What is our response when these technologies come to light? One
day it might be facial recognition, the next day aerial
surveillance or cell phone data, the day after that something that
combines all of those, who knows. Often our response is to call
for a ban, or to call for fit-for-purpose regulations that will
rein in the use of that technology and prevent potential abuses.
That may well be useful for that particular technology and for
preventing the specific abuses that arise from it, but it's not
going to do anything about the technologies that are still
operating in the shadows.
I've seen this happen again and again.

To me it raises the question of whether there needs to be a way to
create principles that can be more broadly applied and can ensure
that technologies cannot be legally deployed simply because they
are omitted from the law. In the same way that if a company makes
a new type of food, it needs to get FDA approval, right? Companies
that make surveillance technology don't have that regulatory
or even moral obligation.
 
------------------------------------------------------------------