
Privacy


EPIC Urges Department of Defense to Limit Disclosure of Personnel Records

EPIC - Fri, 2018-11-16 16:10

In comments to the Department of Defense, EPIC has proposed privacy safeguards for the agency's Personnel Vetting system of records. The records system would authorize limitless collection of sensitive information on current, former, and prospective public and private sector employees, their friends and relatives, Red Cross volunteers, and foreign nationals. EPIC opposes the records system's disclosure standards that authorize sharing of individuals' personal information with any requesting source as part of an investigation, including U.S. Citizenship and Immigration Services and foreign law enforcement entities. EPIC consistently warns against overbroad government databases and urges agencies to withdraw unnecessary Privacy Act exemptions.

Categories: Privacy

Pew Research: Widespread Concerns in US About AI

EPIC - Fri, 2018-11-16 14:30

A new survey from the Pew Research Center, "Public Attitudes Toward Computer Algorithms," found widespread concern about the fairness of automated decision making. According to the Pew report, "Americans express broad concerns over the fairness and effectiveness of computer programs making important decisions in people's lives." Americans oppose the use of algorithms for criminal risk assessments (56%), automated resume screening for job applicants (57%), and personal finance scores (68%). Many of the concerns in the Pew report are addressed in the Universal Guidelines for AI, the first human rights framework for AI. More than 200 experts and 50 NGOs have endorsed the Universal Guidelines. Public opinion polls consistently find strong support among Americans for new privacy laws.

Categories: Privacy

Leaks Show Europe's Attempts to Fix the Copyright Directive Are Failing

EFF News - Fri, 2018-11-16 05:32

The EU’s “Copyright in the Digital Single Market Directive” is closer than ever to becoming law in 28 European countries, and the deep structural flaws in its most controversial clauses have never been more evident.

Some background: the European Union had long planned on introducing a new copyright directive in 2018, updating the previous directive from 2001. The EU's experts weighed a number of proposals, producing official recommendations on what should (and shouldn't) be included in the new directive, and meeting with stakeholders to draft language suitable for adoption into the EU member states' national laws.

Two proposals were firmly rejected by the EU's experts: Article 11, which would limit who could link to news articles and under which circumstances; and Article 13, which would force online platforms to censor their users' text, video, audio, code, still images, etc., based on a crowdsourced database of allegedly copyrighted works.

But despite the EU's expert advice, these clauses were re-introduced at the last minute, at a stage in the directive's progress where they would be unlikely to receive scrutiny or debate. Thankfully, after news of the articles spread across the Internet, Europe’s own voters took action and one million Europeans wrote to their MEPs to demand additional debate. When that debate took place in September, a divided opposition to the proposals allowed them to continue on to the next phase.

Now, the directive is in the final leg of its journey into law: the "trilogues," where the national governments of Europe negotiate with the EU's officials to produce a final draft that will be presented to the Parliament for a vote.

The trilogues over the new directive are the first in EU history where the public are allowed some insight into the process, thanks to a European Court of Justice ruling that allows members of the European Parliament to publicly disclose the details of the trilogues. German Pirate Party MEP Julia Reda has been publishing regular updates from behind the trilogues' closed doors.

It's anything but an orderly process. A change in the Italian government prompted the country to withdraw its support for the directive. Together with those nations that were already unsure of the articles, this means that there are enough opposing countries to kill the directive. However, the opposition remains divided over tactics and that means that the directive is still proceeding through the trilogues.

The latest news is a leaked set of proposed revisions to the directive, aimed at correcting the extraordinarily sloppy drafting of Articles 11 and 13.

These revisions are a mixed bag. In a few cases, they bring much-needed clarity to the proposals, but in other cases, they actually worsen the proposals—for example, the existing language holds out the possibility that platforms could avoid using automated copyright filters (which are viewed as a recipe for disaster by the world's leading computer scientists, including the inventors of the web and the Internet's core technologies). The proposed clarification eliminates that possibility.

To get a sense of how unready Articles 11 and 13 are, in their current form or with the proposed trilogue revisions, have a look at the proposals from the Don't Wreck the Net coalition. The coalition, which brings together civil society groups and a variety of small and large platforms from the US and the EU, has produced its own list of defects in the directive that must be corrected before anyone can figure out what the articles mean, let alone try to obey them. Here are a few:

  • Make it explicit that existing liability protections, such as those in the E-Commerce Directive, remain in place even under Article 13.
  • Clearly define what is meant by “appropriate and proportionate,” as it provides absolutely no guidance to service providers and is left wide open for litigation and abuse.
  • Clarify which “service providers” Article 13 applies to in much more detail. This includes a clear definition of “public access to large amounts of works.” What is “large”?
  • There should be clear and significant penalties for providing false reports of infringement.
  • Copyright holders should be required to help platforms identify specific cases of infringement to be addressed, rather than requiring service providers to police every corner of their services.
  • There need to be clear exceptions for sites that make a good faith effort to comply, but that inadvertently allow some infringement to slip through on their platforms.
  • There should be required transparency reports on how Article 13 is being used, including reports on abusive claims of infringement.

We're disappointed to see how little progress the trilogues have made in the months since they disappeared behind their closed doors. The proponents of Articles 11 and 13 have had years to do their homework and draft fit-for-purpose rules that can be parsed by governments, companies, and users, but instead they've smuggled a hastily drafted, nebulous pair of dangerous proposals into a law that will profoundly affect the digital lives of more than 500 million Europeans. The lack of progress since suggests that the forces pushing for Articles 11 and 13 have no idea how to fix the unfixable, and are prepared to simply foist them on the EU, warts and all.

Categories: Privacy

Facing EPIC Lawsuit, FAA Scraps Secretive Drone Committees

EPIC - Thu, 2018-11-15 11:15

The FAA's Drone Advisory Committee, facing an open government lawsuit from EPIC, has scrapped the secretive committees that developed drone policy. EPIC filed a lawsuit challenging the closed-door meetings with agency officials and industry reps. EPIC also charged that the FAA ignored the privacy risks posed by the deployment of drones—even after identifying privacy as a top public concern. The FAA acknowledged that the committees provided policy advice, but the FAA failed to comply with open government laws. EPIC has a long history of promoting government transparency and advocating for privacy protections against drones.

Categories: Privacy

EFF and MuckRock Release Records and Data from 200 Law Enforcement Agencies' Automated License Plate Reader Programs

EFF News - Thu, 2018-11-15 10:09

EFF and MuckRock have filed hundreds of public records requests with law enforcement agencies around the country to reveal how data collected from automated license plate readers (ALPR) is used to track the travel patterns of drivers. We focused exclusively on departments that contract with surveillance vendor Vigilant Solutions to share data between their ALPR systems.

Today we are releasing records obtained from 200 agencies, accounting for more than 2.5 billion license plate scans in 2016 and 2017. This data is collected regardless of whether the vehicle, its owner, or its driver is suspected of involvement in a crime. In fact, the records show that 99.5% of the license plates scanned were not under suspicion at the time they were collected.

On average, agencies are sharing data with a minimum of 160 other agencies through Vigilant Solutions’ LEARN system, though many agencies are sharing data with over 800 separate entities.

Click below to explore EFF and MuckRock’s dataset and learn how much data these agencies are collecting and how they are sharing it. We've made the information searchable and downloadable as a CSV file. You can also read the source documents on DocumentCloud or track our ongoing requests. 

Read the Report and Explore the Data

DATA DRIVEN: HOW COPS ARE COLLECTING AND SHARING OUR TRAVEL PATTERNS USING AUTOMATED LICENSE PLATE READERS

Related Cases: Automated License Plate Readers - ACLU of Southern California & EFF v. LAPD & LASD; Automated License Plate Readers (ALPR)

Categories: Privacy

Honoring the 2018 Pioneer Award Winners and John Perry Barlow

EFF News - Wed, 2018-11-14 16:52

EFF’s annual Pioneer Awards Ceremony recognizes extraordinary individuals for their commitment and leadership in extending freedom and innovation on the electronic frontier. At this year’s event, held on September 27 in San Francisco, EFF rededicated the Pioneer Awards to EFF co-founder and Grateful Dead lyricist John Perry Barlow. Barlow’s commitment to online freedom was commemorated by dubbing the Pioneer Awards statuette the “Barlow.” EFF welcomed keynote speaker Daniel Ellsberg, known for his work in releasing the Pentagon Papers, to help award the very first Barlows. This year's honorees were fair use champion Stephanie Lenz, European Digital Rights leader Joe McNamee, and groundbreaking content moderation researcher Sarah T. Roberts.

Read the transcript of the full 2018 Pioneer Awards Ceremony here.

The evening kicked off with EFF Executive Director Cindy Cohn who had the honor of renaming the Pioneer Award as the “Barlow” to pay tribute to Barlow’s work creating EFF and his role in establishing the movement for Internet freedom. “Barlow was one of the first people to see the potential of the Internet as a place of freedom where voices long silenced could find an audience and people could connect, regardless of physical distance,” Cohn said. (If you’re an award winner and reading this, you’ll be happy to know she also gave the green light to previous award winners to retroactively call their awards the Barlow.)

Cindy Cohn dedicates the Pioneer Awards to EFF co-founder John Perry Barlow.

Cohn introduced two of his daughters, Anna and Amelia Barlow (known affectionately as the Barlowettes), to the stage to share some words. Anna paid tribute to her father’s talents and his ability to weave two worlds together, and shared a video of him speaking about the necessity of the physical world to provide a framework for love, illustrating his theory of life.

Amelia centered on gratitude, sharing funny anecdotes about her father’s ancestral connections to early America. She emphasized the importance of perceptivity, respect, and wisdom needed in the information era to carry her father’s legacy forward, and in an emotional moment told the room, “I really feel like those people are you.” She continued, “Maybe we all will be guided by the wisdom of those who have come before us and not forget what is true as a means of seeking a beautiful future with the long view, the long game, and all beings in mind.”

Amelia Barlow addresses the audience during the 'Barlow' dedication.

Cohn introduced Daniel Ellsberg, Barlow’s friend and a board member of the Freedom of the Press Foundation. Ellsberg’s release of the Pentagon Papers in 1971 exposed U.S. criminality in Vietnam at great personal risk to himself, and he has since tirelessly supported whistleblowers and worked to shed light on government surveillance. Cohn highlighted Ellsberg’s understanding of how national security work can affect the psyche of a government official as secrecy becomes more than a job, but an identity. This makes it even more difficult for whistleblowers to step forward. “I can honestly say that without you as a role model for breaking out of the secrecy cult, the NSA's mass surveillance programs would still likely be a secret to this day,” she said as she thanked him for his service.

Daniel Ellsberg and Cindy Cohn

Ellsberg took the stage to a standing ovation, and shared his impressions of the Supreme Court hearings. “I believed Anita Hill then. I believe Christine Blasey Ford now,” he said. He told the story of how he first met Barlow and Barlow called him a “revelationary,” a term, he mused humorously, that was “a lot better than ‘whistleblower.’”

A high point was hearing Ellsberg call the Pioneer Awards the most exciting day of his life, as he was finally able to meet Chelsea Manning, who was in attendance that evening. He joked that he had missed her many times, once seeing only the back of her head. “But I waited 39 years for her to appear in this world,” he said before continuing on to detail the significance of the documents she leaked. He went on to praise both Manning and Edward Snowden: “I have often said that I identify more with them as revelationaries than with any other people in the world.”

"Here's the great thing about the choice to become an advocate: anyone can make it.”

EFF Legal Director Corynne McSherry introduced honoree Stephanie Lenz. Lenz became a fair use hero when, with the assistance of EFF, she sued Universal Music for sending her a takedown notice under the Digital Millennium Copyright Act over a 29-second YouTube video of her kids dancing to Prince’s song “Let’s Go Crazy,” even though her video was legitimate fair use. The fight took ten years to win, but Lenz never gave up. “Stephanie Lenz is not most people. She decided to take another course. She decided to fight back,” said McSherry. In doing so, Lenz became a voice for thousands of users who have had their work taken down unfairly, and she made history. Lenz encouraged the audience to all become activists: “I could've chosen silence. I chose speech. Here's the great thing about the choice to become an advocate: anyone can make it.”

Corynne McSherry and Stephanie Lenz, winner for her fight for fair use.

Danny O’Brien introduced the next honoree, Joe McNamee, and humorously praised his humility, stating that McNamee only agreed to accept the award if he could do so on behalf of his colleagues. True to his word, O’Brien presented the award to McNamee and the European community. “Anyone know who that guy is?” quipped McNamee.

Danny O'Brien and Joe McNamee, winner for his work with European Digital Rights.

McNamee is Executive Director of European Digital Rights, Europe’s association for organizations supporting online freedom. From his home base of Brussels, he pioneered digital rights advocacy in Europe with his work on net neutrality and the General Data Protection Regulation (GDPR), and, notably, was a centralizing force bringing diverse groups, from politicians to activists, together. McNamee shared his concern about the copyright directive and the problems that arise when companies implement policies on a global level. He also asked the audience to go home and watch the video of Taren Stinebrickner-Kauffman speaking on the banality of evil during her acceptance speech for Aaron Swartz’s posthumous Pioneer Award: “And when you watch that video, be outraged that it's truer today than it was a few years ago. Be proud that you're part of a community that does not accept this banality. And be energized by your outrage to fight the good fight.”

EFF Director of International Free Expression Jillian York presented the evening's last award over live video from Thessaloniki, Greece, to Sarah T. Roberts, who was there keynoting a conference on content moderation. Roberts spoke of the hidden labor and the experiences of workers in content moderation, which is largely invisible, and hoped the award would help “elevate the profile and elevate the experience to these workers that have been hidden for so long.” Roberts’ research on content moderation has been vital in documenting and informing how social media companies use low-wage laborers to the detriment of free expression and the health and well-being of the screeners.

Jillian C. York and Sarah T. Roberts, winner for her commercial content moderation work, LIVE FROM GREECE!

We are deeply grateful to Anna Barlow, Amelia Barlow, Daniel Ellsberg, and all of this year’s honorees for their contributions in the digital world and far beyond. This was truly an ideal group to rededicate the Pioneer Awards Ceremony to a visionary like John Perry Barlow.

Awarded every year since 1992, EFF’s Pioneer Awards Ceremony recognizes the leaders who are extending freedom and innovation on the electronic frontier. Honorees are nominated by the public. Previous honorees have included Aaron Swartz, Douglas Engelbart, Richard Stallman, and Anita Borg. Many thanks to the sponsors of the 2018 Pioneer Awards Ceremony: Anonyome Labs; Dropbox; Gandi.net; Ridder, Costa & Johnstone LLP; and Ron Reed. If you or your company are interested in learning more about sponsorship, please contact nicole@eff.org.

The 2018 Barlows revealed!

Categories: Privacy

The Supreme Court Should Confirm, Again, that Abstract Software Patents Don’t Need a Trial to be Proved Invalid

EFF News - Wed, 2018-11-14 15:39

This year, we celebrated the fourth anniversary of the Supreme Court’s landmark ruling in Alice v. CLS Bank. Alice made clear that generic computers do not make abstract ideas eligible for patent protection. Following the decision, district courts across the country started rejecting ineligible abstract patents at early stages of litigation. That has enabled independent software developers and small businesses to fight meritless infringement allegations without taking on the staggering costs and risks of patent litigation. In other words, Alice has made the patent system better at doing what it is supposed to do: promote technological innovation and economic growth.

Unfortunately, Alice’s pro-innovation effects are already in danger. As we’ve explained before, the Federal Circuit’s decision in Berkheimer v. HP Inc. turns Alice upside-down by treating the legal question of patent eligibility as a factual question based on the patent owner’s uncorroborated assertions. That will just make patent litigation take longer and cost more because factual questions generally require expensive discovery and trial before they can be resolved.

Even worse, Berkheimer gives patent owners free rein to actually create factual questions because of its emphasis on a patent’s specification. The specification is the part of the patent that describes the invention and the background state of the art. The Patent Office generally does not have the time or resources to verify whether every statement in the specification is accurate. This means that, in effect, the Berkheimer ruling will allow patent owners to create factual disputes and defeat summary judgment by inserting convenient “facts” into their patent applications.

If permitted to stand, the decision will embolden trolls with software patents to use the ruinous cost of litigation to extract settlement payments for invalid patents—just as they did before Alice. Unfortunately, district courts and patent examiners are already relying on Berkheimer to allow patents that should be canceled under Alice to survive in litigation or issue as granted patents. Berkheimer is good news for patent trolls, but it’s bad news for those most vulnerable to abusive litigation threats—software start-ups, developers, and users.

Now that the Federal Circuit has declined rehearing en banc (with all active judges participating in the decision), only the Supreme Court can prevent Berkheimer’s errors from turning Alice into a dead letter. That’s why EFF, together with the R Street Institute, has filed an amicus brief [PDF] urging the Supreme Court to grant certiorari, and fix yet another flawed Federal Circuit decision.

Our brief explains that Berkheimer is wrong on the law and bad for innovation. First, it exempts patent owners from the rules of federal court litigation by permitting them to rely on uncorroborated statements in a patent specification to avoid speedy judgment under Alice. Second, it conflicts with Supreme Court precedent, which has never required factfinding in deciding the legal question of patent eligibility. Third, it threatens to undo the innovation, creativity, and economic growth that Alice has made possible, especially in the software industry, because Alice empowers courts to decide patent eligibility without factfinding or trial.

We hope the Supreme Court grants certiorari and confirms that patent eligibility is a legal question that courts can answer, just as it did in Alice.  

Categories: Privacy

Tech Talk: A News-tritious Media Diet

CDT - Wed, 2018-11-14 12:43

CDT’s Tech Talk is a podcast where we dish on tech and Internet policy, while also explaining what these policies mean to our daily lives. You can find Tech Talk on SoundCloud, iTunes, and Google Play, as well as Stitcher and TuneIn.

How informed are you about the news you’re reading online? Do you know if your news diet is balanced and filled with actual facts? For anyone who gets their news online, it’s clear that plenty of people do not have a nutritious news diet. Well, now there’s a tool to help!

The Newseum, Freedom Forum Institute and Our.News recently launched Newstrition, an interactive tool that makes it easy for the public to make informed decisions about what is real and what is “junk” news. In this episode, Lata Nott from the Freedom Forum Institute joins to talk about Newstrition.

After that, we welcome Teddy Hartman from the Howard County Public Schools. He shares a local perspective on how school districts can proactively address student privacy issues. Yes, you can have strong privacy practices and use new technology.

Listen

The post Tech Talk: A News-tritious Media Diet appeared first on Center for Democracy & Technology.

Categories: Privacy

The Midterm Election Showed the Critical Need for Technical Volunteers in 2020

CDT - Tue, 2018-11-13 16:48

The November 2020 General Election already seems like a distant date on the calendar, but for election officials the preparation begins in just a couple of weeks. Performance during the November 2018 Midterm Election is already being assessed so that the lessons learned can be incorporated into plans for primary elections which begin just 15 months from now. 2019 will be the window of opportunity for election officials to make bold moves to continue securing their elections and improving the voting experience for their constituents. One move that can help both of those areas is addressing the lack of technically-capable volunteers.

Rather than the expected reports of cyber-related incidents during the Midterm Election, broken and missing equipment dominated the headlines. In Georgia, electronic poll books used to check in voters were not writing to the smart cards needed to cast ballots. High humidity caused an unusually long two-page ballot style to jam scanning machines across New York City. Some straight-party votes cast on DRE machines in Texas appeared to flip selections. These incidents resulted in long lines, dissuaded voters, and lawsuits challenging the validity of the outcome. The danger is that some of these incidents may be interpreted by the public as the malicious acts of foreign adversaries. One solution is tapping into civic-minded community members with technical skills or an IT support (infosec) background to contribute their knowledge and talents as technical volunteers.

Election officials already manage a small army of dedicated volunteers that must be identified, vetted, trained, and managed to perform Election Day activities. Traditional volunteers have been adequate for setting up equipment, checking in voters, and monitoring for compliance. Today, however, they are being asked to perform functions that are much more technical in nature. Modern voting equipment is becoming increasingly complex and is difficult to operate and troubleshoot for users who are not technically savvy. Volunteers with specific technical abilities and training can perform tasks such as providing in-person tech support at polling locations, assisting with network security, and monitoring social media for misinformation. These tasks are becoming a normal part of election activities. Polling locations with modern equipment such as e-pollbooks, ballot marking devices, and precinct scanners require volunteers who are comfortable setting up, operating, and troubleshooting them. Voters can sense stressful situations when equipment is not working properly, and that has a negative impact on voter confidence.

“The poll workers were panicked. They just didn’t know how to turn on the machines.”

The cybersecurity job market has tightened, making hiring qualified full-time professionals difficult for many jurisdictions. New ideas to formalize large-scale, on-call technical assistance, such as the Civilian Cybersecurity Corps or National Guard deployment, will require additional time and legislative/executive action to be realized. Identifying and cultivating a small but powerful base of technically capable volunteers is something that election officials can do within their existing authority and have in place by 2020. The infosec community can use CDT’s Infosec Toolkit for Election Volunteering to better understand the election process and volunteer roles. Election officials can use CDT’s Election Officials Toolkit for Technical Volunteers to better understand how to vet and deploy technical volunteers.

The post The Midterm Election Showed the Critical Need for Technical Volunteers in 2020 appeared first on Center for Democracy & Technology.

Categories: Privacy

CDT Signs Onto Principles for Privacy Legislation, Calls On NTIA to Promote Robust Privacy Law in Congress

CDT - Tue, 2018-11-13 13:07

With the midterm election now largely settled, all signs point to consumer privacy law as being one area where Congress and the Trump administration can work together to advance rules that will force companies to safeguard and use our information responsibly. In that spirit, today CDT joined with 34 other civil rights, consumer, and privacy advocacy organizations in releasing public interest principles for privacy legislation. Together, we have called for Congress to enact a law that promotes fairness, prevents discrimination, and advances equal opportunity wherever and whenever data is collected, used, or shared. We have also filed comments with the Trump administration calling for the same.

The unfortunate reality is that many commercial practices today are simply not fair to individual consumers. Data brokers double down on transparency requirements, even as no one knows who they are. Other companies collect detailed behavioral data about their own customers “just because.” Companies promise control over precise geolocation information, yet collect and sell location data with abandon. Often, companies insist these activities are ultimately about providing better advertising, but this ignores how data is used to discriminate in advertising, excluding older workers from even seeing job ads or targeting “abortion-minded women” visiting health clinics.

The Trump administration must also work with Congress to enact meaningful privacy legislation that addresses these harms. The coalition legislative principles follow last Friday’s comment deadline from the National Telecommunications & Information Administration (NTIA), which is engaged in an effort to develop an administration-wide approach to privacy. In our comments, CDT has called on the NTIA to put forward a concrete legislative proposal. We appreciate the NTIA’s recognition that companies must embrace longstanding Fair Information Practice Principles (FIPPs), as well as internal accountability and risk management efforts, but voluntary frameworks and internal corporate processes are insufficient to protect our privacy.

CDT’s comments underscore concerns we have with relying on staffing, privacy by design procedures, or internal privacy risk assessments as the primary basis of privacy protection. While shifting the responsibility for protecting data away from individuals and proactively onto companies is a laudable goal, accountability and risk management relies on the internal privacy values of businesses. Absent a firm set of legislative rules, risk management still gives companies considerable discretion to determine what risks individuals should assume.

We also lack a shared consensus around what privacy risk even is. To the extent that risk management becomes part of the administration’s proposal, CDT recommends adopting the set of risks compiled by the National Institute of Standards and Technology (NIST). NIST acknowledges that privacy risks exist beyond economic loss and include diminished capacity for autonomy and self-determination, discrimination (legal or otherwise), and a generalized loss of trust. An even more extensive framing of risk is present in a legislative discussion draft from Intel, which includes (1) psychological harm, (2) significant inconvenience and loss of time, (3) adverse eligibility determinations, (4) stigmatization and reputational harm, (5) unwanted commercial communications, (6) price discrimination, and (7) other effects that alter experiences, limit choices, or influence individuals, in addition to expected financial or physical harms.

Developing meaningful privacy protection requires addressing broader equity and fairness concerns raised by new technologies. A bigger challenge for a federal privacy framework is how to address the risks from opaque and discriminatory algorithms. Applications of artificial intelligence and machine learning present a difficult test for privacy and are an extensive focus of the EU General Data Protection Regulation (GDPR).

Privacy debates must resolve how ubiquitous data flows generate information and power asymmetries that benefit companies at individual expense. This may also require the NTIA to consider how privacy norms are shaped by user experience and user interface, as well as so-called “dark patterns.” Privacy management now goes beyond what individuals can reasonably control, with spillover effects that impact the public at large. At a minimum, we recommend the NTIA solicit the views of additional perspectives across civil rights organizations to ensure it crafts privacy protections that address the concerns of marginalized and vulnerable communities.

Personal information is an evolving concept, and the NTIA must clarify what the scope of information is covered under its approach to privacy. A broad definition of personal information is appropriate in today’s digital environment to protect consumers and capture evolving business practices that undermine privacy, and CDT endorses the test adopted by the Federal Trade Commission, as well as the Office of Management & Budget, that considers information to be personal data where it can be linked or made reasonably linkable to an individual.

We also urge careful consideration of what sort of exemptions de-identified and other types of anonymous data should be subject to under a federal privacy law. De-identification is a valuable process for protecting privacy, but CDT would suggest it is time to reassess what reasonable de-identification measures should entail and to acknowledge the growing misuse that occurs with aggregated and traditionally non-personal information.
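One reason "reasonable de-identification" is hard to pin down is that rare combinations of seemingly innocuous fields can single people out. A minimal sketch of the classic k-anonymity check illustrates the idea; the dataset and field names here are invented for illustration and are not drawn from CDT's comments:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the k-anonymity level of a dataset: the size of the
    smallest group of records that share the same combination of
    quasi-identifier values. A low k means re-identification is easy."""
    groups = Counter(
        tuple(rec[qi] for qi in quasi_identifiers) for rec in records
    )
    return min(groups.values())

# Hypothetical "de-identified" records: no names, yet still risky.
records = [
    {"zip": "90210", "age_band": "30-39", "diagnosis": "A"},
    {"zip": "90210", "age_band": "30-39", "diagnosis": "B"},
    {"zip": "10001", "age_band": "40-49", "diagnosis": "C"},
]

# The lone 10001/40-49 record makes this dataset only 1-anonymous.
print(k_anonymity(records, ["zip", "age_band"]))  # 1
```

A dataset with k = 1 contains at least one person who is uniquely identifiable from the quasi-identifiers alone, which is why stripping direct identifiers is not, by itself, a reasonable de-identification measure.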

In the end, the United States needs specific rights that are created through legislative action in Congress. Federal law needs to go beyond giving individuals more notices and choices about their privacy, and we think it is time for legislators in Congress to flip the privacy presumption and declare some data practices unfair. As CDT recently testified before Congress, we are advocating for a legislative solution that (1) grants affirmative individual rights to data, including access, correction, deletion, and portability, (2) requires reasonable security practices and transparency from companies, (3) prevents advertising discrimination against protected classes, and (4) presumptively prohibits certain unfair data practices. These protections must be backed by robust enforcement.

CDT hopes the administration champions this approach, and as the Public Interest Privacy Legislation Principles demonstrate, many organizations stand ready to work with the NTIA and Congress to propose concrete language to these ends.

CDT’s comments to the NTIA are available here, and you can read the Public Interest Privacy Legislation Principles here.

The post CDT Signs Onto Principles for Privacy Legislation, Calls On NTIA to Promote Robust Privacy Law in Congress appeared first on Center for Democracy & Technology.

Categories: Privacy

EFF, Human Rights Watch, and More Than 70 Civil Society Groups Ask Mark Zuckerberg to Provide All Users with a Mechanism to Appeal Content Censorship on Facebook

EFF News - Tue, 2018-11-13 10:53

The World's Freedom of Expression Is in Your Hands, Groups Tell Facebook CEO

English version

San Francisco - The Electronic Frontier Foundation and more than 70 human and digital rights groups called on Mark Zuckerberg today to add real transparency and accountability to Facebook's content removal process. Specifically, the groups demand that Facebook clearly explain how much content it removes, both rightly and wrongly, and provide all of its users with a fair and timely method to appeal removals and get their content back up.

While Facebook is already under enormous and still-mounting pressure to remove material that is truly threatening, without transparency, fairness, and processes to identify and correct mistakes, Facebook's content takedown policies too often backfire and silence the very people who should be able to make their voices heard on the platform.

Politicians, museums, celebrities, and other high-profile groups and individuals whose improperly removed content can attract media attention may have little trouble reaching Facebook and getting their content restored; sometimes they even receive an apology. But the average user? Not so much. Facebook only allows people to appeal content decisions in a limited set of circumstances, and in many cases, users have absolutely no option to appeal. Onlinecensorship.org, an EFF project where any user can report takedown notices, has collected reports of hundreds of unjustified takedown incidents in which no appeal was available. For most users, content Facebook removes is rarely restored, and some people are kicked off the platform for no good reason.

EFF, Article 19, the Center for Democracy and Technology, and Ranking Digital Rights wrote directly to Mark Zuckerberg today demanding that Facebook implement common-sense standards so that average users can easily appeal content moderation decisions, receive prompt replies and timely review by a human or humans, and have the opportunity to present evidence during the review process. The letter was co-signed by more than 70 human rights, digital rights, and civil liberties organizations from South America, Europe, the Middle East, Asia, and Africa.

"You shouldn't have to be famous or make headlines to get Facebook to respond to bad content moderation decisions, but that's exactly what's happening," said EFF Director for International Freedom of Expression Jillian York. "Mark Zuckerberg created a company that's the world's premier communications platform. He has a responsibility to all users, not just those who can make the most noise and potentially make the company look bad."

In addition to implementing a meaningful appeals process, EFF and its partners called on Mr. Zuckerberg to issue transparency reports on community standards enforcement that include a breakdown of the type of content that has been restricted, data on how content moderation actions were initiated, and the number of decisions that were appealed and found to have been made in error.

"Facebook is way behind its competitors when it comes to transparency and accountability in content censorship decisions," said EFF Senior Information Security Counsel Nate Cardozo. "We're asking Mr. Zuckerberg to implement the Santa Clara Principles, and release actual numbers detailing how often Facebook removes content, and how often it does so incorrectly."

"We know that content moderation policies are being unevenly applied, and an enormous amount of content is being removed improperly each week. But we don't have numbers or data that can tell us how big the problem is, what content is affected the most, and how appeals were dealt with," said Cardozo. "Mr. Zuckerberg should make transparency about these decisions, which affect millions of people around the world, a priority at Facebook."

The letter:
https://santaclaraprinciples.org/open-letter-spanish/

The Santa Clara Principles:
https://santaclaraprinciples.org/

For more information on private censorship:
https://www.eff.org/deeplinks/2018/09/platform-censorship-lessons-copyright-wars
https://www.eff.org/deeplinks/2018/04/smarter-privacy-rules-what-look-what-avoid

Contact: Jillian C. York, Director for International Freedom of Expression, jillian@eff.org; Nate Cardozo, Senior Information Security Counsel, nate@eff.org

Categories: Privacy

Federal Researchers Complete Second Round of Problematic Tattoo Recognition Experiments

EFF News - Tue, 2018-11-13 10:42

Despite igniting controversy over ethical lapses and the threat to civil liberties posed by its tattoo recognition experiments the first time around, the National Institute of Standards and Technology (NIST) recently completed its second major project evaluating software designed to reveal who we are and potentially what we believe based on our body art.

Unsurprisingly, these experiments continue to be problematic.

The latest experiment was called Tatt-E, which is short for “Tattoo Recognition Technology Evaluation.” Using tattoo images collected by state and local law enforcement from incarcerated people, NIST tested algorithms created by the state-backed Chinese Academy of Sciences and by MorphoTrak, a subsidiary of the French corporation Idemia.

According to the Tatt-E results, which were published in October, the best-performing tattoo recognition algorithm, by MorphoTrak, had 67.9% accuracy in matching separate images of the same tattoo to each other on the first try.

NIST further tested the algorithms on 10,000 images downloaded from Flickr users by Singaporean researchers, even though it was not part of the original scope of Tatt-E. These showed significantly improved accuracy, as high as 99%.

Tattoo recognition technology is similar to other biometric technologies such as face recognition or iris scanning: an algorithm analyzes an image of a tattoo and attempts to match it to a similar tattoo or image in a database. But unlike other forms of biometrics, tattoos are not only a physical feature but a form of expression, whether it is a cross, a portrait of a family member, or the logo for someone’s favorite band.
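The matching step described above can be sketched in miniature. The toy example below is not NIST's or the vendors' actual method (real systems use far richer image features); it uses a simple "difference hash" to turn an image into bits, then finds the gallery entry with the fewest differing bits from a probe image. All images and names are invented:

```python
def dhash_bits(pixels):
    """Difference hash: for each adjacent pair of pixels in a row,
    record whether the left pixel is brighter than the right one.
    Similar images yield similar bit strings."""
    return [
        1 if left > right else 0
        for row in pixels
        for left, right in zip(row, row[1:])
    ]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# Tiny stand-ins for grayscale tattoo images (rows of pixel intensities).
gallery = {
    "anchor_tattoo": [[10, 60, 30], [80, 20, 40]],
    "rose_tattoo":   [[90, 10, 70], [15, 85, 5]],
}
probe = [[12, 58, 33], [79, 22, 41]]  # a noisy re-photograph of "anchor_tattoo"

# Match the probe against the database: smallest hash distance wins.
probe_hash = dhash_bits(probe)
best = min(gallery, key=lambda name: hamming(probe_hash, dhash_bits(gallery[name])))
print(best)  # anchor_tattoo
```

The privacy concern follows directly from the mechanics: the same distance-to-database lookup that matches two photos of one tattoo can just as easily surface other people whose tattoos hash similarly.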

Since 2014, the FBI has sponsored NIST’s tattoo recognition project to advance this emerging technology. In 2016, an EFF investigation revealed that NIST had skipped over key ethical oversight processes and privacy protections with its earlier experiments, called Tatt-C, short for the Tattoo Recognition Challenge. That experiment promoted using tattoo recognition technology to investigate people’s beliefs and memberships, including their religion. The more recent Tatt-E, however, did not test for “tattoo similarity”—the ability to match tattoos that are similar in theme and design, but belong to different people.

A database of images captured from incarcerated people was provided to third parties—including private corporations and academic institutions—with little regard for the privacy implications. After EFF called out NIST, the agency retroactively altered its presentations and reports, including eliminating problematic information and replacing images of inmate tattoos in a “Best Practices” poster with topless photos of a researcher with marker drawn all over his body. The agency also pledged to implement new oversight procedures.

However, transparency is lacking. Last November, EFF filed suit against NIST and the FBI after the agencies failed to provide records in response to our Freedom of Information Act requests. So far, the records we have freed have revealed how the FBI is seeking to develop a mobile app that can recognize the meaning of tattoos, and the absurd methods NIST used to adjust its “Best Practices” documents. Yet our lawsuit continues, as the agency continues to withhold records and has redacted much of the documents it has produced.

Tatt-E was the latest set of experiments conducted by NIST. Unlike Tatt-C, which involved 19 entities, only two entities chose to participate in Tatt-E, each of which has foreign ties. Both the Chinese Academy of Sciences and MorphoTrak submitted six algorithms for testing against a dataset of tattoo images provided by the Michigan State Police and the Pinellas County Sheriff’s Office in Florida.

MorphoTrak’s algorithms significantly outperformed the Chinese Academy of Sciences’, which may not be surprising since the company’s software has been used with the Michigan State Police’s tattoo database for more than eight years. Its best algorithm could return a positive match within the first 10 images 72.1% of the time, and that number climbed to 84.8% if researchers cropped the source image down to just the tattoo. The accuracy in the first 10 images increased to 95% if they used the infrared spectrum. In addition, the Chinese Academy of Sciences’ algorithms performed poorly with tattoos on dark skin, although skin tone did not make much of a difference for MorphoTrak’s software.
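The figures above are rank-k identification rates: how often the true mate appears among the matcher's top k returned candidates. A minimal sketch of the metric, with invented identities and rankings rather than Tatt-E data:

```python
def rank_k_accuracy(results, k):
    """Fraction of probes whose true mate appears among the top-k
    candidates returned by the matcher (the 'first k images')."""
    hits = sum(1 for truth, candidates in results if truth in candidates[:k])
    return hits / len(results)

# Each entry: (true identity, matcher's ranked candidate list).
results = [
    ("t1", ["t1", "t9", "t4"]),   # hit at rank 1
    ("t2", ["t7", "t2", "t5"]),   # hit at rank 2
    ("t3", ["t8", "t6", "t9"]),   # miss
]
print(rank_k_accuracy(results, 1))   # rank-1 rate: 1 of 3 probes
print(rank_k_accuracy(results, 3))   # rank-3 rate: 2 of 3 probes
```

By construction, the rank-10 numbers reported for Tatt-E can only be higher than the rank-1 numbers, which is worth remembering when comparing the 67.9% first-try figure with the rank-10 figures.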

One of the more concerning flaws in the research is that NIST did not document “false positives”: cases where the software says it has matched two tattoos, but the match turns out to be in error. Although this kind of misidentification has been a perpetual problem with face recognition, the researchers felt that it was not useful to the study. In fact, they suggest that false positives may have “investigative utility in operations.” While they don’t explain exactly what this use case might be, from other documents produced by NIST we can infer they are likely discussing how similar tattoos on different people could establish connections among their wearers.
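The measurement NIST omitted is straightforward to define: at a given decision threshold, the false positive rate is the share of comparisons between different tattoos that the matcher nonetheless accepts. A minimal sketch with made-up similarity scores:

```python
def false_positive_rate(impostor_scores, threshold):
    """Fraction of 'impostor' comparisons (pairs of images known to
    show different tattoos) scored at or above the accept threshold,
    i.e., matches the system wrongly reports."""
    accepted = sum(1 for s in impostor_scores if s >= threshold)
    return accepted / len(impostor_scores)

# Hypothetical similarity scores for pairs of non-matching tattoos.
impostor_scores = [0.12, 0.35, 0.71, 0.08, 0.64]
print(false_positive_rate(impostor_scores, threshold=0.6))  # 0.4
```

Without this number, published accuracy figures say nothing about how often the system would falsely link two different people who happen to wear similar tattoos.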

While Tatt-E was supposed to be limited to images collected by law enforcement, NIST went a step further and used the Nanyang Technological University Tattoo Database, compiled from images taken from Flickr users, for further research. With this dataset, the Chinese Academy of Sciences' algorithms performed better, hitting as high as 99.3% accuracy.

No matter the accuracy in identification, tattoo recognition raises serious concerns for our freedoms. As we’ve already seen, improperly interpreted tattoos have been used to brand people as gang members and fast track them for deportation. EFF urges NIST to make Tatt-E its last experiment with this technology.

Related Cases: NIST Tattoo Recognition Technology Program FOIA

Categories: Privacy

EPIC Supports Constitutionality of "Robocall" Law

EPIC - Tue, 2018-11-13 09:50

EPIC has filed a "friend of the court" brief in a case concerning the constitutionality of the Telephone Consumer Protection Act, the law that prohibits unwanted "robocalls." In Gallion v. Charter Communications, EPIC argued that "the TCPA prohibitions are needed now more than ever," citing the intrusiveness of marketing calls directed toward cell phones. EPIC also said the TCPA "protects important consumer privacy interests." EPIC testified in support of the TCPA and has submitted extensive comments and amicus briefs on the consumer privacy law.

Categories: Privacy

EFF, Human Rights Watch, and Over 70 Civil Society Groups Ask Mark Zuckerberg to Provide All Users with Mechanism to Appeal Content Censorship on Facebook

EFF News - Tue, 2018-11-13 08:59

World’s Freedom of Expression Is In Your Hands, Groups Tell CEO

Spanish version 

San Francisco—The Electronic Frontier Foundation (EFF) and more than 70 human and digital rights groups called on Mark Zuckerberg today to add real transparency and accountability to Facebook’s content removal process. Specifically, the groups demand that Facebook clearly explain how much content it removes, both rightly and wrongly, and provide all users with a fair and timely method to appeal removals and get their content back up.

While Facebook is under enormous—and still mounting—pressure to remove material that is truly threatening, without transparency, fairness, and processes to identify and correct mistakes, Facebook’s content takedown policies too often backfire and silence the very people that should have their voices heard on the platform.

Politicians, museums, celebrities, and other high profile groups and individuals whose improperly removed content can garner media attention seem to have little trouble reaching Facebook to have content restored—they sometimes even receive an apology. But the average user? Not so much. Facebook only allows people to appeal content decisions in a limited set of circumstances, and in many cases, users have absolutely no option to appeal. Onlinecensorship.org, an EFF project for users to report takedown notices, has collected reports of hundreds of unjustified takedown incidents where appeals were unavailable. For most users, content Facebook removes is rarely restored, and some are banned from the platform for no good reason.

EFF, Article 19, the Center for Democracy and Technology, and Ranking Digital Rights wrote directly to Mark Zuckerberg today demanding that Facebook implement common sense standards so that average users can easily appeal content moderation decisions, receive prompt replies and timely review by a human or humans, and have the opportunity to present evidence during the review process. The letter was co-signed by more than 70 human rights, digital rights, and civil liberties organizations from South America, Europe, the Middle East, Asia, Africa, and the U.S.

“You shouldn’t have to be famous or make headlines to get Facebook to respond to bad content moderation decisions, but that’s exactly what’s happening,” said EFF Director for International Freedom of Expression Jillian York. “Mark Zuckerberg created a company that’s the world’s premier communications platform. He has a responsibility to all users, not just those who can make the most noise and potentially make the company look bad.”

In addition to implementing a meaningful appeals process, EFF and partners called on Mr. Zuckerberg to issue transparency reports on community standards enforcement that include a breakdown of the type of content that has been restricted, data on how the content moderation actions were initiated, and the number of decisions that were appealed and found to have been made in error.

“Facebook is way behind other platforms when it comes to transparency and accountability in content censorship decisions,” said EFF Senior Information Security Counsel Nate Cardozo. “We’re asking Mr. Zuckerberg to implement the Santa Clara Principles, and release actual numbers detailing how often Facebook removes content—and how often it does so incorrectly.”

“We know that content moderation policies are being unevenly applied, and an enormous amount of content is being removed improperly each week. But we don’t have numbers or data that can tell us how big the problem is, what content is affected the most, and how appeals were dealt with,” said Cardozo. “Mr. Zuckerberg should make transparency about these decisions, which affect millions of people around the world, a priority at Facebook.”

For the letter:
https://santaclaraprinciples.org/open-letter/

For the Santa Clara Principles:
https://santaclaraprinciples.org/

For more information on private censorship:
https://www.eff.org/deeplinks/2018/09/platform-censorship-lessons-copyright-wars
https://www.eff.org/deeplinks/2018/04/smarter-privacy-rules-what-look-what-avoid

Contact: Jillian C. York, Director for International Freedom of Expression, jillian@eff.org; Nate Cardozo, Senior Information Security Counsel, nate@eff.org

Categories: Privacy

EFF to U.S. Department of Commerce: Protect Consumer Data Privacy

EFF News - Mon, 2018-11-12 18:59

On Friday, November 9, 2018, EFF submitted a letter in response to the U.S. Department of Commerce's request for comment on "Developing the Administration's Approach to Consumer Privacy," urging the agency to consider any future policy proposals in a users' rights framework. We emphasized five concrete recommendations for any Administration policy proposal or proposed legislation regarding the data privacy rights of users online:

  1. Requiring opt-in consent to online data gathering
  2. Giving users a “right to know” about data gathering and sharing
  3. Giving users a right to data portability
  4. Imposing requirements on companies for when customer data is breached
  5. Requiring businesses that collect personal data directly from consumers to serve as “information fiduciaries,” similar to the duty of care required of certified public accountants.

But, to be clear, any new federal data privacy regulation or statute must not preempt stronger state data privacy rules. For example, on June 28, California enacted the California Consumer Privacy Act (A.B. 375) (“CCPA”). Though there are other examples, the CCPA is the most comprehensive state-based data privacy law, and while it could be improved, its swift passage highlights how state legislators are often in the best position to respond to the needs of their constituents. While baseline federal privacy legislation would benefit consumers across the country, any federal privacy regulation or legislation that preempts and supplants state action would actually hurt consumers and prevent states from protecting the needs of their constituents.

It is also important that any new regulations must be judicious and narrowly tailored, avoiding tech mandates and expensive burdens that would undermine competition (already a problem in some tech spaces) or infringe on First Amendment rights. To accomplish that, policymakers must start by consulting with technologists as well as lawyers. Also, one size does not fit all: smaller entities should be exempted from some data privacy rules.

EFF welcomes the opportunity to discuss with the Department of Commerce these or any other issues regarding federal data privacy policy, federal privacy legislation, or state privacy legislation.

Categories: Privacy

EPIC Comments on NTIA’s Consumer Privacy Framework

EPIC - Fri, 2018-11-09 21:18

EPIC submitted comments to the National Telecommunications and Information Administration—the agency that advises the White House on Internet policy—on the proposed framework for consumer privacy. EPIC backed the "Desired Outcomes": (1) transparency, (2) control, (3) minimization, (4) security, (5) access and correction, (6) risk management, and (7) accountability. But EPIC urged the agency to support federal baseline legislation, the creation of a data protection agency, and the ratification of the International Privacy Convention. EPIC explained, "These are not policy preferences or partisan perspectives. These are the steps that modern societies must take to safeguard the personal data of their citizens." NTIA Administrator David Redl met with the Privacy Coalition last month.

Categories: Privacy
