The people of America have whipped out their smart phones and called their government to the stand. The social issues that have offended our citizens for centuries are on trial, with cell phone technology providing the bulk of the incriminating evidence. Cool software, initially designed for entertainment and lifestyle efficiency, has morphed into a crucial weapon against injustice. By attaching visuals to the externalities of policy absence, abuse, and reform, smart phones have held American authorities accountable to the people they swear to protect. But smart phone technology is flawed, just like its human creators. Biased software programming has contributed to some of society’s most serious harms, and discriminatory behaviors that were once only fears in the brick-and-mortar world have algorithmically evolved into covert inequity.

Smart Phones, Serious Privacy Invasion

Smart phones are physical manifestations of our most private thoughts, interests, and desires. They are location trackers that completely relinquish our anonymity; evidence boards holding years of biographical data; and gateways between our three-dimensional and internet realities. Backend access to this data means that condemnatory narratives can be formed before individuals have the opportunity to explain. This probably isn’t worrisome for those who live quiet lives, or those who live comfortably outside the realm of predatory marketing. But it is alarming for the student activists, journalists, community organizers, hot-button issue advocates, and minorities who generally receive the short end of the criminal justice stick, and of life in general.

During inauguration weekend, a chaotic time of American uncertainty, officials seized protestors’ smart phones and cracked the passwords to view the internal content. After accessing call detail records, email logs, text messages, messaging applications, website search history, and images and videos, police charged the protestors with conspiracy to riot. Black Lives Matter activists have been regularly monitored through the cell phone GPS locator that connects public social media posts to geographic tags. Stingrays, cell phone surveillance devices that mimic cell phone towers and trick phones into transmitting their locations and identifying information, have been used to track and capture undocumented immigrants. Uber and Lyft track location data even when consumers are not using the app. Twitter and Instagram track and store data analytics to share with advertisers. Facebook collects users’ face prints to fuel its photo tagging feature. And Google, which retains users’ search histories and Gmail content, uses information gleaned from users’ email, search results, map requests, and YouTube views to create and specialize advertising content. If granted a warrant, law enforcement can gain access to all of this information.

New Technology, New Concerns

This year, both Apple and Samsung introduced new-generation smart phones with facial recognition software. Face ID locks the device until the owner simply looks at it; with one glance, the user gains complete access. In the current political climate, this software warrants concern about both police and civilian misconduct. Face ID is not a strong security measure. In fact, Samsung’s device was already tricked with a photograph. Moreover, facial recognition systems have a history of racial bias attributable to the lack of diversity in product development: algorithms trained on mostly White faces have higher error rates when interacting with Black, Chinese, and Indian faces. No reports have been published on databases that consider transitioning members of the LGBTQIA community. Meanwhile, Georgetown Law published a report showing that 1 in 2 adults in the U.S. have their images in a facial recognition network. Police departments can search these faces without regulation, using algorithms that have not been audited for accuracy. Law enforcement is also embracing machine learning in conjunction with facial recognition software for predictive policing. Some judges are using machine-generated risk scores to determine the length of prison sentences.

So, in this political climate where islamophobia, racism, sexism, and gender discrimination are bigots’ primary weapons of terror, it is impossible to ignore the repercussions Face ID could produce in the heightened tension of a wrongful arrest, a peaceful protest gone awry, or a chance encounter between two strangers with opposing beliefs and ethics. There is even the concern that a person with a bruised, bloodied, or swollen face may not be able to unlock his or her own phone to access specific contacts or live apps that have been the sole link between crime and justice. Here lies the double-edged sword of innovation. Apple and Samsung do allow users to opt out of biometric security, and they do disclose their data usage practices in their privacy policies. But this is not sufficient, because Face ID is a socially impactful technology with security concerns far beyond the scope of advertising.

New Innovation, No Regulation

Furthermore, there are no constitutional protections for biometric security measures. Adi Robertson, Senior Reporter for the technology news publication The Verge, writes that the Fifth Amendment, which protects individuals from having to incriminate themselves, holds that passcodes are “testimonial” evidence. Suspects are not legally obligated to disclose passcodes because doing so would mean answering a question based on the contents of one’s thoughts, not physical evidence. However, security experts have warned that fingerprints do not fall under this rule and that face scanning likely would not either. Standing there while a law enforcement officer holds a phone up to your face, eye, or picture is not a “testimonial” act because it does not require the suspect to provide any information that is inside his or her mind. A Virginia judge, Steven Frucci, let police use a fingerprint to unlock a phone in 2014, and other courts granted similar requests in 2016 and 2017.

Although very cool, face scanning is another invasive technology. It has the ability to discriminate against African Americans, Hispanics, and immigrants battling police brutality, racial profiling, and travel bans in Trump’s America. Implications for transitioning members of the LGBTQIA community are also up for debate. This technology, in conjunction with those previously mentioned, is another unregulated puzzle piece in the infrastructure of authoritarianism. Technology is exploding while simultaneously expanding the negative impact of all of our human prejudices, blind spots, and flaws. Yet society is focused on the cool factor. Well, at least until the daily news uncovers another disenfranchised community and public servants scramble to regulate the privacy and security threats of innovation.
