At the start of the pandemic, technologists and policymakers touted the promise of technology to track and warn individuals of potential COVID-19 exposure and high-risk areas. But whether due to overburdened contact tracers or the lack of early and coordinated adoption, those technologies never became a central part of the public health effort against the disease.
With attention now on various schemes for “vaccine passports,” a concept that poses threats to privacy and other civil liberties, and with nearly one in five adults in the country now fully vaccinated, civil society, policymakers, and the public must remain wary of efforts to cement health and contact-tracing apps into everyday life, and must continue to ask targeted questions about how they are used.
If not, such health surveillance apps — justified as necessary to monitor lingering cases, verify vaccination and health status for travel, and predict future outbreaks — may well become a lasting and increasingly invasive feature of public life.
Since the outbreak of the virus, at least 24 states and Washington, D.C. have rolled out exposure notification apps built on Google and Apple’s Exposure Notification framework, GAEN, which uses Bluetooth signals and randomly generated keys to flag possible exposures, or on Exposure Notifications Express, which lets public health authorities use the framework without building or maintaining their own app. These tools are more privacy-protective than other location-based tracking proposals we have seen. Save for a few worrying instances, the vast majority of state contact tracing apps have remained voluntary and have largely been built on GAEN, something many other countries cannot say.
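To make that privacy distinction concrete, here is a rough Python sketch of the rotating-identifier idea underlying this design. The function names and derivation below are simplified placeholders rather than the actual Apple/Google specification; the point is that phones broadcast only short-lived random identifiers, and exposure matching happens on the device against keys that diagnosed users voluntarily share, so no central authority ever receives a location trail.

```python
import os
import hashlib

# A minimal, hypothetical sketch of the rotating-identifier idea behind
# GAEN-style exposure notification. This is NOT the actual Apple/Google
# protocol; names and derivations are simplified for illustration only.

def daily_key():
    # Each phone generates a fresh random key every day. It never leaves the
    # device unless the user tests positive and chooses to share it.
    return os.urandom(16)

def rolling_id(day_key, interval):
    # Short-lived identifiers derived from the daily key are what the phone
    # actually broadcasts over Bluetooth: no location, no name, no account.
    return hashlib.sha256(day_key + interval.to_bytes(4, "big")).digest()[:16]

def check_exposure(heard_ids, shared_keys, intervals_per_day=144):
    # Matching happens on the device: re-derive identifiers from keys that
    # diagnosed users voluntarily published and compare them against the
    # identifiers this phone heard from nearby devices.
    return any(
        rolling_id(key, i) in heard_ids
        for key in shared_keys
        for i in range(intervals_per_day)
    )

# Example: Bob's phone stores identifiers it heard from Alice's phone.
alice_key = daily_key()
heard = {rolling_id(alice_key, i) for i in range(3)}
print(check_exposure(heard, [alice_key]))    # True: Alice later shared her key
print(check_exposure(heard, [daily_key()]))  # False: a stranger's key never matches
```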
Still, several privacy-violative deployments in the U.S. should not be replicated. In North and South Dakota, the developer of the states’ Care19 app — designed to “anonymously” cache locations visited by users for more than ten minutes — was found to have secretly violated its own privacy policy by sharing users’ location data and personal identifiers with third parties, including Foursquare. In Utah, a $2.75 million app called Healthy Together, touted to utilize GPS and Bluetooth to augment contact tracing, became largely “a waste” three months after its launch when state officials shut off the location tracking feature because of widespread refusal to download the app.
College campuses were home to some of the most egregious cases of technology-assisted contact tracing, reflecting both administrators’ wide latitude to enact mandatory policies and the influence of aggressive tech company marketing on unwitting administrators. At Michigan’s Albion College, a mandatory app called Aura used real-time location tracking to ensure students never left campus grounds or switched off their location; students who did were locked out of buildings and faced suspension. One savvy student who looked into the app’s source code found the security keys to the app’s backend servers, exposing students’ names, addresses, test results, and dates of birth.
Other schools, like Harvard University and the University of California, Irvine, have used Wi-Fi tracking to monitor students’ movements and crowd flow. As a general rule, the use of location tracking is extremely problematic: it is insufficiently accurate for contact tracing, and its use by law enforcement without a warrant violates the Fourth Amendment.
We were also unhappy to see policies like those at James Madison University that mandated the use of tracking apps. With no formal appeals process, students and faculty could be reported to campus police and academic heads for failing to pass a five-question symptom survey, and campus community members were encouraged to inform on one another regarding suspected violations. Beyond being an instance of health theater, such coercive policies heavily incentivize false responses and risk being disparately enforced.
We’ve also seen the deployment of facial recognition and physiological surveillance on some campuses to fight COVID-19. Molloy College, for example, installed face recognition temperature kiosks in central campus buildings and dormitories and linked them to campus identification systems, despite the technology’s highly dubious effectiveness. The University of Southern California, one of the earliest campuses to adopt fingerprint scanning for access to certain campus buildings and dorms, recently replaced those scanners with mandatory facial recognition scanners. In Michigan, Oakland University has distributed a wearable device known as the BioButton, which has a 90-day battery life and continuously logs skin temperature, respiratory rate, and resting heart rate. Although voluntary, the technology is an example of continuous surveillance hastily implemented, without large-scale testing or FDA certification of its effectiveness, and one that puts the burden on students to ensure their private health information is expunged from the third-party company’s records.
Though these instances of campus and state overreach are far from the norm, overbroad efforts to curb and track COVID-19 leave the door open to an abiding surveillance apparatus that won’t be dissolved once the dust of the public emergency settles. As the Biden administration looks into the interoperability of contact tracing apps, tech companies like sp0n — the creators of the controversial neighborhood safety app Citizen — are partnering with cities for digital contact tracing, while others investigate how contact tracing apps might double as digital immunity and vaccination passports for global travel.
As always, we ought to remain open to creative and privacy-protective ways of using technology during disease outbreaks. At the same time, we have a duty to ensure that temporary COVID-19 data surveillance infrastructures do not take hold and outlast the effects of this once-in-a-century pandemic.