Disappointing Ben Franklin: Tough Choices Between Safety And Privacy
by Bryan Thomas Schmidt & Brian Gifford
We science-fiction fans love new technology. From drones to spaceships to self-driving cars and robots, the annals of science fiction are filled with great stories in which tech plays a major part. That tech has many applications, and we don’t always love the “what-if” questions it raises. This is especially the case when that technology is employed by law enforcement because the answers to those questions can directly impact our safety and privacy.
During a debate over cities banning the use of facial recognition technology, Jennifer Jones, a staff attorney for the ACLU of Northern California, stated, “Police departments are exploiting people’s fears about crime to amass more power. This has been true for decades; we see new technologies being pushed in moments of crisis.”1 For many, this exchange captures the bottom line of the debate over privacy, safety, and law enforcement technology. For some, privacy is paramount; for others, safety wins every time.
In its 2012 decision in United States v. Jones, a case concerning warrantless GPS tracking, the Supreme Court ruled against the use of such techniques without a warrant. It found that long-term surveillance of this kind builds “a precise, comprehensive record of a person’s public movements that reflects a wealth of detail about her familial, political, professional, religious, and sexual associations” sufficient to qualify as an unreasonable search absent a warrant.2
For most of us, privacy is becoming difficult to find in a world of social media, widespread traffic and CCTV cameras, and such newer technologies as Alexa, facial recognition, Automatic License Plate Readers, pole cams, Acoustic Gunshot Detection Systems, body cams, and drones; the list can be exhausting. All of these technologies have the potential to improve users’ safety and security, and every one of them could be used irresponsibly. This raises the question: how do we define the responsible use of public safety technologies?
Benjamin Franklin once said, “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.”3 We value safety almost as much as freedom and privacy, and technological advancement has continued to put these cardinal values at odds with each other. Information availability provides both security and risk; having home cameras available through the internet lets a homeowner see what’s happening when they’re away but allows a skilled criminal to see the same thing. This dichotomy is especially apparent when we consider the implications of advanced technologies for law enforcement surveillance.
One of the inherent problems with the storage, use, and protection of surveillance data is that responsible use hasn’t been consistently defined by policymakers. While some guidance exists, such as the Department of Justice’s 2020 overview of the Privacy Act,4 organizations have largely been left to their own devices to create practical policies.5 Consequently, limitations are often defined by public perception and the impact of breaches, rather than by any well-considered or coherent set of rules.
Introducing new technologies always brings a variety of new issues. For example, Automatic License Plate Readers give police the ability to go back in time and identify where vehicles were seen before a crime was committed. Storage and use of this data have been a sticking point for organizations such as the ACLU because it represents surveillance prior to any determination of probable cause.6 Law enforcement argues that the information is gathered in public space, defined as space open and available to the general public.7 This is distinct from something like using cell phone location history: license plates are observed in the public domain and recorded by the police department’s own resources, whereas acquiring cell phone geolocation information requires the active participation of cell phone companies.
Many devices and services, especially internet-enabled ones such as Alexa and Siri, collect data as a matter of course and for a variety of reasons. Users often acknowledge End User License Agreements without bothering to read them, leaving consumers little to no recourse when they discover the amazing array of information being gathered in support of targeted advertising, “application improvement,” and safety.8 Social media companies like Facebook, now known as Meta, and Google have long been two of the most public nexuses for big data collection and are periodically accused of a variety of abuses by privacy advocates and public safety organizations alike.9
In their defense, social media companies argue that targeted advertisements and services are a net benefit to the user, and that information offered for commercial consumption is anonymized to prevent the accidental disclosure of personally identifiable information (PII). The flipside of that argument is the long list of breaches and data aggregation issues that have plagued users and cost many millions of dollars in identity theft and other fraud. The motive for collecting all this data is evident in the enormous revenues companies such as Meta, Amazon, and Google earn by selling targeted data to advertisers and other commercial interests.9
These issues are closely tied to privacy in public service, especially where data use and retention are concerned. Many public safety organizations have begun distancing themselves from systems such as Automatic License Plate Readers because of the cost and exposure the systems bring with them. ALPR systems generate historical data that must be maintained according to each department’s retention policies, yet that data represents a wealth of PII and a host of privacy concerns. Breaches of this kind expose departments to damaging lawsuits and public relations nightmares, and put at risk the very population they are charged with keeping safe.10
Many states have introduced statutes limiting the retention time for surveillance data, with the intent of addressing public concerns while allowing departments to safely use systems that offer myriad public benefits. Examples include historical information on where a vehicle has been seen, patterns of a suspect’s movements, and general traffic pattern analysis.11 Even with these policies in place, the systems are still seen as potential problems. Several organizations, most notably the ACLU, have expressed serious concerns based upon statistical analysis of the information captured and how it’s used: historically disenfranchised populations represent the majority of those tracked, suggesting that the information gathered by these systems could reinforce racial and economic bias.6
Another question new technology raises is where the balance lies between cost and convenience. Facial recognition technologies now being tested by the Transportation Security Administration replace boarding passes with scans that compare a traveler’s face with their ID.12 Some travelers are uncomfortable with face-scanning technology, but the system allows users to pass through security checkpoints more rapidly, and those who refuse to participate wind up enduring longer waits. Nonparticipation thus becomes a silent tax on travelers’ time that many lack the patience to pay. This is a common theme among new technologies that offer efficiency or comfort at the price of privacy. What price are we willing to pay for convenience, and where should the line be drawn? The answer may differ for everyone but is not so easily debated when time pressure is the driving factor. Worse, the more people opt in to new technologies, the more everyone else is forced to use the new systems, because alternatives cease to be available.
Ideas about what the public wants, needs, and has a right to expect may depend on the individual, but for organizations and law enforcement, “what can we afford” becomes the deciding factor. A good example is the adoption of body cameras by police departments. The press and protestors loudly demanded them in 2020 after a series of incidents of police brutality beginning with the murder of George Floyd, yet even three years later some departments are still struggling to meet those expectations. In fact, law enforcement agencies had begun adopting the technology en masse after the death of Michael Brown in Ferguson, Missouri, in 2014. As of 2016, the National Institute of Justice found that 80% of large police departments had body-worn cameras available, and 87% of those departments had a written policy in place for their use. The NIJ study noted that the departments that had not yet adopted body-worn cameras most often cited the cost and maintenance of the systems as their reason.13
There are numerous examples of convenience trumping privacy concerns, such as the E-ZPass system many states employ to collect tolls on interstates. These systems gather personal data that, while not generally used for law enforcement purposes, has that potential.14 Yet most drivers probably never give it a second thought, because E-ZPass technology has become such a part of the landscape.
Another piece of tech that has become common is the unmanned aerial system, better known as the drone. Drones lack consistent oversight and common legal limitations across states, contributing to confusion and misuse by owners; federal statutes governing their use, apart from some of the FAA’s safety limitations, are notably absent.15 An additional factor is their popularity: a wide spectrum of public and private users operate drones in public space for an equally wide variety of purposes. The definition of the boundaries of “public space” varies between jurisdictions, making police use of these impressive tools a continually evolving challenge. What a person in the street can see at ground level and what a drone at the same GPS coordinates can observe from fifty feet in the air are widely different, yet depending on the municipality, those perspectives might be legally identical.16
Public perception is another factor, and drones illustrate it viscerally. Drones have been in the news for decades and are often confused with the weapon-carrying military systems routinely shown blowing up bad guys in areas of conflict. But for most law enforcement agencies and organizations, weaponized drones remain firmly in the realm of fiction. Despite this reality, public perception of these systems impacts decision making when it comes to law enforcement entities deploying these critical tools.
The question of privacy vs. safety is an ongoing debate that is neither easily nor quickly resolved. Technology forces privacy to evolve constantly, in an unending cycle of development and counter-development. While we may one day be able to have both unfettered privacy and complete safety, the way public safety technologies gather information makes that day seem further off than ever. Hallowed privacy can still be found, but the price of technological alone-time is paid in reduced access to the world and in inconvenience. For most of us, that price is just too high, and we will accept more prying eyes if we can turn our lights off without getting out of our chairs, get to work more quickly, and get through airport security without showing ID. Those who wish otherwise will just have to be along for the trip, or else find another ride entirely.
On October 22, 2022, in the Philippines, Bryan’s close friend was murdered outside her fiancé’s home. Police used GPS tracking, cell phone history, CCTV, social media chat history, and various crime scene tech right out of CSI not only to track down but to identify the five people responsible for her death: three hired killers and the two relatives who paid them. They managed it in thirty-five days, a near miracle in a country where much of the population, including most civil servants like police, lives in poverty with limited resources. Even in far wealthier and more resource-rich countries, cases can drag on much longer. As concerned as we all are about privacy and safety, after watching his loved ones placed in police protection as witnesses and facing daily fear for their lives, Bryan felt more than glad that the technology was available to keep them safe, even if compromises in privacy were required. Certainly, people directly impacted by crimes have a different perspective. But finding common ground we can all live with remains a goal worthy of ongoing effort.
1 Dave, P. (2022, May 12). U.S. Cities are Backing Off Banning Facial Recognition as Crime Rises. https://www.reuters.com/world/us/us-cities-are-backing-off-banning-facial-recognition-crime-rises-2022-05-12/
2 Legal Information Institute. (2012, January 23). U.S. Supreme Court, United States v. Jones. https://www.law.cornell.edu/supremecourt/text/10-1259
3 Franklin, B. (1755). Votes and Proceedings of the House of Representatives, 1755–1756. Philadelphia: The National Archives. https://founders.archives.gov/documents/Franklin/01-06-02-0107
4 U.S. Department of Justice. (2022, October 4). Overview of the Privacy Act: 2020 Edition. https://www.justice.gov/opcl/overview-privacy-act-1974-2020-edition/introduction
5 International Association of Privacy Professionals. (2022). Organizational Privacy Policies. https://iapp.org/resources/topics/organizational-privacy-policies/
6 Crump, C. (2013, July). You Are Being Tracked: How License Plate Readers are Being Used to Record Americans’ Movements. https://www.aclu.org/documents/you-are-being-tracked-how-license-plate-readers-are-being-used-record-americans-movements?
7 Miller, K. F. (2007). Designs on the Public: The Private Lives of New York’s Public Spaces. University of Minnesota Press.
8 Cybersecurity & Infrastructure Security Agency. Reviewing End-User License Agreements. https://www.cisa.gov/sites/default/files/publications/EULA.pdf
9 FTC.gov. (2019, July 24). FTC Imposes $5 Billion Penalty and Sweeping New Privacy Restrictions on Facebook. https://www.ftc.gov/news-events/news/press-releases/2019/07/ftc-imposes-5-billion-penalty-sweeping-new-privacy-restrictions-facebook
10 Krebs, B. (2022, May 12). DEA Investigating Breach of Law Enforcement Data Portal. https://krebsonsecurity.com/2022/05/dea-investigating-breach-of-law-enforcement-data-portal/
11 California State Auditor. (2020, February). Automated License Plate Readers.
12 Transportation Security Administration. (2022). Evaluating Facial Identification Technology—Airport Proofs of Concept. https://www.tsa.gov/biometrics-technology/evaluating-facial-identification-technology
13 National Institute of Justice. (2022, January 7). Research on Body-Worn Cameras and Law Enforcement. https://nij.ojp.gov/topics/articles/research-body-worn-cameras-and-law-enforcement
14 Hirose, M. (2015, April 24). Newly Obtained Records Reveal Extensive Monitoring of E-ZPass Tags Throughout New York. https://www.aclu.org/news/privacy-technology/newly-obtained-records-reveal-extensive-monitoring-e-zpass
15 Federal Aviation Administration. (2022). Drones. https://www.faa.gov/uas
16 SupremeCourt.gov. (2022, February 22). Travis Tuggle v. United States.
Bryan Thomas Schmidt is the #1 bestselling author and Hugo-nominated editor of eleven novels and twenty-two anthologies. His latest anthology, Robots Through the Ages, was coedited by Robert Silverberg. His near future hard science fiction novel Shortcut releases mid-September.
Brian Gifford is a twenty-five-year military veteran and the proud father of three amazing boys. The product of computer geeks, he was raised in an environment that promoted the free exchange of ideas and opinions, so long as it was understood that his mother was always right. He continues the tradition to this day with his wife, at great cost to his hairline. Brian’s scifi novel The Dark Bringer is forthcoming, and he has published several short stories.
Copyright © 2023 Bryan Thomas Schmidt & Brian Gifford