
Privacy


San Francisco: Building Community Broadband to Protect Net Neutrality and Online Privacy

EFF News - Fri, 2018-02-23 18:36

Like many cities around the country, San Francisco is considering an investment in community broadband infrastructure: high-speed fiber that would make Internet access cheaper and better for city residents. Community broadband can help alleviate a number of issues with Internet access that we see all over America today. Many Americans have no choice of provider for high-speed Internet, Congress eliminated user privacy protections in 2017, and the FCC decided to roll back net neutrality protections in December.

This week, San Francisco published the recommendations of a group of experts, including EFF’s Kit Walsh, regarding how to protect the privacy and speech of those using community broadband.

The Blue Ribbon Panel on Municipal Fiber released its third report, which tackles competition, security, privacy, net neutrality, and more. It recommends that San Francisco’s community broadband require net neutrality and privacy protections: any ISP looking to use the city’s infrastructure would have to adhere to certain standards. The model of community broadband that EFF favors is sometimes called “dark fiber” or “open access.” In this model, the government invests in fiber infrastructure, then opens it up for private companies to compete as your ISP. This means the big incumbent ISPs can no longer block new competitors from offering you Internet service. San Francisco is pursuing the “open access” option, and is quite far along in its process.

The “open access” model is preferable to one in which the government itself acts as the ISP, because of the civil liberties risks posed by a government acting as your conduit to information.

Of course, private ISPs can also abuse your privacy and restrict your opportunities to speak and learn online.

To prevent such harms, the expert panel explained how the city could best operate its network so that competition, as well as legal requirements, would prevent ISPs from violating net neutrality or the privacy of residents.

That would include, as was found in the 2015 Open Internet Order recently repealed by the FCC, a ban on blocking of sites, content, or applications; a ban on throttling sites, content, or applications; and a ban on paid prioritization, where ISPs favor themselves or companies who have paid them by giving their content better treatment.

The report also recommends requiring a number of consumer protections that Congress prevented from ever being enacted. If an ISP wanted to sell or share a customer’s personal information with anyone, the customer would have to give permission first. Even the use of data that doesn’t identify someone would require permission. Both of these would have to be “opt-in,” meaning it would be assumed that there was no consent to use the data. (“Opt-out” would mean that using customer data is assumed to be fine unless the customer figures out how to say no.)

Furthermore, the goal is to build infrastructure that connects every home and business to a fiber optic network, guaranteeing everyone in the city access to fast, reliable Internet. And while the actual lines will be owned by the city, it will be an “open-access” model—that is, space on the city-owned lines will be leased to private companies, creating competition and choice.

The report also recommends that San Francisco require ISPs to protect privacy when faced with legal challenges or demands from government agencies. It recommends San Francisco require ISPs using its network to do a number of things to help protect the civil liberties and privacy of users: for example, give up the right to look at customer communications, give up the right to consent to searches of those communications, and swear to tell customers when they’re being asked to hand over information, unless prohibited by law from doing so.

With all of these measures combined, San Francisco’s community broadband looks to be doing as much as possible to provide choices while also ensuring that all of those options lead to a safe and secure connection to a free and open Internet. That’s something we can all work toward in our communities.

Categories: Privacy

The Federal Circuit Should Not Allow Patents on Inventions that Should Belong to the Public

EFF News - Fri, 2018-02-23 16:19

One of the most fundamental aspects of patent law is that patents should only be awarded for new inventions. That is, not only must an invention be new to the inventor in order to receive a patent, it must also be new to the world. Someone who independently comes up with an idea shouldn’t get a patent if someone else already came up with the same idea and disclosed it to the public.

There’s good reason for this: patents are an artificial restraint on trade. They work to increase costs (the patent owner is rewarded with higher prices) and can impede follow-on innovation. Policy makers generally try to justify what would otherwise be considered a monopoly through the argument that without patents, inventors may never have invested in research or might not want to make their inventions public. Thus, the story goes, we should give people limited monopolies in the hopes that overall, we end up with more innovation (whether this is actually true, particularly for software, is debatable).

A rule of the U.S. Court of Appeals for the Federal Circuit, however, upends the patent bargain and allows a second-comer—someone who wasn’t the first inventor—to get a patent under a particular, albeit fairly limited, circumstance. A new petition challenges this rule, and EFF has filed an amicus brief in support of undoing the Federal Circuit’s misguided rule.

The rule is based on highly technical details of the Patent Act, which you can read about in our brief along with those of Ariosa (the patent challenger) and a group of law professors (not yet available). Our brief argues that the Federal Circuit rule is an incorrect understanding of the law. We ask the Federal Circuit to rehear the issue with the full court, and reverse its current rule.

While the Federal Circuit rule is fairly limited and doesn’t arise in many situations, we have significant concerns about the policy it seems to espouse. Contrary to decades of Supreme Court precedent, the rule allows, under certain circumstances, someone to get a patent on something that had already been disclosed to the public. We believe that is always bad policy.


FOSTA Would Be a Disaster for Online Communities

EFF News - Thu, 2018-02-22 19:41

Frankenstein Bill Combines the Worst of SESTA and FOSTA. Tell Your Representative to Reject New Version of H.R. 1865.

The House of Representatives is about to vote on a bill that would force online platforms to censor their users. The Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA, H.R. 1865) might sound noble, but it would do nothing to stop sex traffickers. What it would do is force online platforms to police their users’ speech more forcefully than ever before, silencing legitimate voices in the process.

Back in December, we said that while FOSTA was a very dangerous bill, its impact on online spaces would not be as broad as the Senate bill, the Stop Enabling Sex Traffickers Act (SESTA, S. 1693). That’s about to change.

The House Rules Committee is about to approve a new version of FOSTA [.pdf] that incorporates most of the dangerous components of SESTA. This new Frankenstein’s Monster of a bill would be a disaster for Internet intermediaries, marginalized communities, and even trafficking victims themselves.

If you don’t want Congress to undermine the online communities we all rely on, please take a moment to call your representative and urge them to oppose FOSTA.

Take Action

Stop FOSTA

Gutting Section 230 Is Not a Solution

The problem with FOSTA and SESTA isn’t a single provision or two; it’s the whole approach.

FOSTA would undermine Section 230, the law protecting online platforms from some types of liability for their users’ speech. As we’ve explained before, the modern Internet is only possible thanks to a strong Section 230. Without Section 230, most of the online platforms we use would never have been formed—the risk of liability for their users’ actions would have simply been too high.

Section 230 strikes an important balance for when online platforms can be held liable for their users’ speech. Contrary to FOSTA supporters’ claims, Section 230 does nothing to protect platforms that break federal criminal law. In particular, if an Internet company knowingly engages in the advertising of sex trafficking, the U.S. Department of Justice can and should prosecute it. Additionally, Internet companies are not immune from civil liability for user-generated content if plaintiffs can show that a company had a direct hand in creating the illegal content.

The new version of FOSTA would destroy that careful balance, opening platforms to increased criminal and civil liability at both the federal and state levels. This includes a new federal sex trafficking crime targeted at web platforms (in addition to 18 U.S.C. § 1591)—but which would not require a platform to have knowledge that people are using it for sex trafficking purposes. This also includes exceptions to Section 230 for state law criminal prosecutions against online platforms, as well as civil claims under federal law and civil enforcement of federal law by state attorneys general.

Perhaps most disturbingly, the new version of FOSTA would make the changes to Section 230 apply retroactively: a platform could be prosecuted for failing to comply with the law before it was even passed.

FOSTA Would Chill Innovation

Together, these measures would chill innovation and competition among Internet companies. Large companies like Google and Facebook may have the budgets to survive the massive increase in litigation and liability that FOSTA would bring. They may also have the budgets to implement a mix of automated filters and human censors to comply with the law. Small startups don’t. And with the increased risk of litigation, it would be difficult for new startups ever to find the funding they need to compete with Google.

Today’s large Internet companies would not have grown to prominence without the protections of Section 230. FOSTA would pull up the ladder that allowed those companies to grow, making it very difficult for newcomers ever to compete with them.

FOSTA Would Censor Victims

Congress should think long and hard before dismantling the very tools that have proven most effective in fighting trafficking.

More dangerous still is the impact that FOSTA would have on online speech. Facing the threat of extreme criminal and civil penalties, web platforms large and small would have little choice but to silence legitimate voices. Supporters of SESTA and FOSTA pretend that it’s easy to distinguish online postings related to sex trafficking from ones that aren’t. It’s not—and it’s impossible at the scale needed to police a site as large as Facebook or Reddit. The problem is compounded by FOSTA’s expansion of federal prostitution law. Platforms would have to take extreme measures to remove a wide range of postings, especially those related to sex.

Some supporters of these bills have argued that platforms can rely on automated filters in order to distinguish sex trafficking ads from legitimate content. That argument is laughable. It’s difficult for a human to distinguish between a legitimate post and one that supports sex trafficking; a computer certainly could not do it with anything approaching 100% accuracy. Instead, platforms would have to calibrate their filters to over-censor. When web platforms rely too heavily on automated filters, it often puts marginalized voices at a disadvantage.

Most tragically of all, the first people censored would likely be sex trafficking victims themselves. The very same words and phrases that a filter would use to attempt to delete sex trafficking content would also be used by victims of trafficking trying to get help or share their experiences.

There are many, many stories of traffickers being caught by law enforcement thanks to clues that police officers and others found on online platforms. Congress should think long and hard before dismantling the very tools that have proven most effective in fighting trafficking.

FOSTA Is the Wrong Approach

There is no amendment to FOSTA that would make it effective at fighting online trafficking while respecting the civil liberties of everyone online. That’s because the problem with FOSTA and SESTA isn’t a single provision or two; it’s the whole approach.

Creating more legal tools to go after online platforms would not punish sex traffickers. It would punish all of us, wrecking the safe online communities that we use every day. And in the process, it would also undermine the tools that have proven most effective at putting traffickers in prison. FOSTA is not the right solution, and no trimming around the edges will make it the right solution.

If you care about protecting the safety of our online communities—if you care about protecting everyone’s right to speak online, even about sensitive topics—we urge you to call your representative today and tell them to reject FOSTA.

Take Action

Stop FOSTA


The FCC’s Net Neutrality Order Was Just Published, Now the Fight Really Begins

EFF News - Thu, 2018-02-22 16:26

Today, the FCC’s so-called “Restoring Internet Freedom Order,” which repealed the net neutrality protections the FCC had previously created with the 2015 Open Internet Order, was officially published. That means the clock has started ticking on all the ways we can fight back.

While the rule is published today, it doesn’t take effect quite yet: ISPs can’t start blocking, throttling, or engaging in paid prioritization for a little while. So while we still have the protections of the 2015 Open Internet Order, and finally have a published version of the “Restoring Internet Freedom Order,” it’s time to act.

First, under the Congressional Review Act (CRA), Congress can reverse a change in regulation with a simple majority vote. That would bring the 2015 Open Internet Order back into effect. Congress has 60 working days—starting from when the rule is published in the official record—to do this. So those 60 days start now.

The Senate bill has 50 supporters, only one away from the majority it needs to pass. The House of Representatives is a bit further away. By our count, 114 representatives have made public commitments in support of voting for a CRA action. Now that time is ticking down for the vote, tell Congress to save the existing net neutrality rules.

Second, it is now unambiguous that the lawsuits of 22 states, public interest groups, Mozilla, and the Internet Association can begin. While the FCC decision said lawsuits had to wait until ten days after official publication, there was some question about whether federal law said otherwise. Some suits have already been filed, but with the FCC’s ten-day counter now running, it’s clear that lawsuits can proceed.

And, of course, states and other local governments continue to move forward on their own measures to protect net neutrality. 26 state legislatures are considering net neutrality legislation and five governors have issued executive orders on net neutrality. EFF has some ideas on how state law can stand up to the FCC order. Community broadband can also ensure that net neutrality principles are enacted on a local level. For example, San Francisco is currently looking for proposals to build an open-access network that would require net neutrality guarantees from any ISP looking to offer services over the city-owned infrastructure.

So while the FCC’s vote in December was in direct contradiction to the wishes of the majority of Americans, the publishing of that order means that action can really start to be taken.


Embedded Tweets and Display Rights: Dangerous Legal Ground for the Web

CDT - Thu, 2018-02-22 11:28

In a troubling recent decision (Goldman v. Breitbart), a court in the Southern District of New York found that embedding an image from Twitter in a web page hosted by a news site can infringe on the exclusive right of the photographer to control the public display of the image. In the case, photographer Justin Goldman said that news sites, including Breitbart, infringed on this right when they included an embedded tweet containing a photograph he took of Patriots quarterback Tom Brady in the Hamptons.

Putting aside the fact that the photographer could have used a more direct method of controlling the use of his image, like a DMCA 512 take-down notice, interpreting an embed (or even worse, a hyperlink) as a form of copyright infringement is a bad idea. The various forms of hyperlinking, including embedded content, form the connective tissue of the web. They provide instant connections between separate sites and varied forms of content, allowing sites to offer a more streamlined experience and layered information. They are a crucial part of what makes the web open and accessible, which is why judges and policy makers should exercise caution when changing the legal landscape for links.

Disregarding the way technology works is a flawed approach to legal reasoning. For copyright and technology, the details matter.

To help understand why the recent ruling in Goldman is so troubling, here’s a brief explanation of how embedded links work. When you use a browser to access a website, the browser basically reads a set of instructions from the server hosting the website about where to find the bits and pieces of the site and how to assemble them. Some of those pieces will probably be stored on the same server as the website’s instructional code. Other pieces may be stored elsewhere, on different servers. The assembly instructions for websites with embedded content tell the browser where to look for the content (using a uniform resource locator or URL), and where to put it on the page. The browser then locates the proper server and asks it to send the file for the embedded content. If it agrees, the server sends the content directly to the browser on the user’s computer. The website’s server has no interaction with or control over the embedded content other than pointing to its address.

In 2007, the 9th Circuit considered whether Google infringed the display rights of photographers with its “framing” feature of Google Image Search. It reasoned that because Google did not host the images on its own server, but rather provided instructions (to browsers) to find the images on servers not under Google’s control, Google was not responsible for infringement. This became known as the “server test.” The legal reasoning applies the language of the Copyright Act to the technical details of how browsers interact with websites. The Court found that, since only the server hosting the linked content had control of the files, it alone was capable of “communicating” the image files and therefore Google’s embedded link did not constitute a “display” under the Act.

In the recent case, Judge Forrest declined to apply the “server test,” finding that “a website’s servers need not actually store a copy of the work in order to ‘display’ it.” But servers don’t “display” embedded files because they can’t “transmit” or “communicate” a file they don’t possess. There is an important technical difference between sending a copy of an image file and sending instructions for where to look for a file. In the latter case, the website sending “embed” instructions has no control over the third-party servers, or the content they host. In this case, the third-party servers were operated by Twitter, which, had they been asked, could have removed or blocked access to the offending tweets. Instead, they were publicly accessible to anyone with the proper URL, including Breitbart reporters.

There is an even smaller technical difference between embedding and linking, which is why the argument that “sending instructions is the same as sending a file” is so troubling. Both consist of instructions: one directs the browser to retrieve the content automatically (the embed), while the other requires a click (or even just a “hover”). In both cases, the actions are performed by the end user’s browser and the image file is “communicated” by the third party directly to the end user. Although Judge Forrest tries to distinguish hyperlinks as different based on the “volitional” element, it is unclear how variations on hyperlink technology might fit that reasoning. For instance, would a hover-to-show link satisfy the volitional element? Even just the possibility of infringement liability for links could chill their usage, reducing the utility and the “depth” of the web.

The technical workings of the internet have, in many ways, developed in response to judicial interpretations of copyright law. Moving away from a technically detailed approach to the law’s application creates fundamental problems for the open web, and could discourage innovative new ways to create and share information. This is why technically detailed applications of law like the “server test” make sense; it is based on concrete and verifiable information and keeps the liability on the party with actual control over the copyrighted work. More generally, disregarding the way technology works is a flawed approach to legal reasoning. For copyright and technology, the details matter.


EPIC v. IRS: EPIC Urges D.C. Circuit to Green-Light Release of President Trump's Tax Returns

EPIC - Thu, 2018-02-22 11:25

EPIC has filed the opening brief in its case to obtain President Trump's tax returns. EPIC told the D.C. Circuit Court of Appeals that the IRS has the authority to disclose the President's returns to correct numerous misstatements of fact concerning his financial ties to Russia. For example, President Trump tweeted that "Russia has never tried to use leverage over me. I HAVE NOTHING TO DO WITH RUSSIA - NO DEALS, NO LOANS, NO NOTHING"—a claim "plainly contradicted by his own attorneys, family members, and business partners." A Quinnipiac poll released today confirms that the public overwhelmingly supports (67%) the release of the President's returns. As EPIC told the Court, "there has never been a more compelling FOIA request presented to the IRS." EPIC v. IRS is one of several FOIA cases EPIC is pursuing concerning Russian interference in the 2016 Presidential election, including EPIC v. ODNI (scope of Russian interference), EPIC v. FBI (response to Russian cyber attack), and EPIC v. DHS (election cybersecurity). Press Release.


State Progress on Election Cybersecurity

CDT - Wed, 2018-02-21 13:43

The Center for American Progress (CAP) recently released a report titled “Election Security in All 50 States,” which gave a grade of C or lower to 40 states. In fact, no state received a perfect A grade. While former Homeland Security Secretary Jeh Johnson has also said that many states have done little to nothing to prepare since 2016, we don’t believe that’s true. To the contrary, states are well underway in their preparations for the 2018 midterm elections, which are expected to be under increased scrutiny as a result of Russian influence operations designed to sow doubt and fear in the US election process. Director of National Intelligence Daniel Coats stated, “At a minimum, we expect Russia to continue using propaganda, social media, false-flag personas, sympathetic spokespeople, and other means of influence to try to exacerbate social and political fissures in the United States.” It is important to recognize that states in many cases are doing very important work to make their election infrastructure more resilient. Here we highlight five states that, over the past six months, have made substantial progress in improving the security of their election systems and making them more resistant to foreign influence.

Colorado: Risk-limiting Audits

The Colorado Secretary of State adopted Election Rule 25 to mandate how counties would conduct risk-limiting audits (RLAs) beginning with the November 7, 2017 election. An RLA of the results bridges the gap between wholly trusting vote tabulation machine results and completing a full manual recount of all ballots. The audit involves a manual recount of a random sample of the ballots, using statistics to determine with a high level of confidence that the voting machine count is accurate. Implementing RLAs required Colorado to have capable voting machines and significant training in the state’s 64 counties. The RLAs were successfully completed just two weeks after the election.
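The core loop of an RLA can be sketched as follows. This is a toy illustration, not Colorado’s actual procedure: a real RLA derives its sample size and stopping rule statistically from the reported margin and a chosen risk limit, and escalates (up to a full hand recount) when discrepancies appear. The function and variable names are hypothetical:

```python
import random

def audit_sample(machine_records, paper_ballots, sample_size, seed=2017):
    """Toy risk-limiting-audit step: hand-check a random sample of paper
    ballots against the machine's record of those same ballots, and return
    the number of discrepancies found in the sample."""
    rng = random.Random(seed)  # real audits use a publicly generated seed
    indices = rng.sample(range(len(paper_ballots)), sample_size)
    return sum(1 for i in indices if paper_ballots[i] != machine_records[i])

# If the paper trail matches the machine count, the sample finds no
# discrepancies and the audit can stop with high confidence; any
# discrepancy would instead trigger an expanded sample or full recount.
ballots = ["A"] * 90 + ["B"] * 10
print(audit_sample(ballots, list(ballots), 20))  # 0
```

The key statistical idea is that the closer the reported margin, the larger the sample must be before the audit can stop, which is what makes the method “risk-limiting.”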

Illinois: Mandatory Cybersecurity Training

For many election officials and staff, cybersecurity may be a new concept, requiring training and recalibration to the new reality that cybersecurity is everyone’s responsibility. In Illinois, annual cybersecurity training from the Department of Innovation and Technology (DoIT) became mandatory for state employees as of January 1, 2018. DoIT is focused on preventing phishing attacks like those used against Clinton campaign chairman John Podesta, 122 state and local election jurisdictions, and voting machine manufacturers. People will always be the weakest link in the cybersecurity chain. Mandatory annual cybersecurity training will provide the state with the opportunity to reinforce fundamental practices and adapt training to meet new threats.

Rhode Island: Security Risk Assessment

The state issued a Request for Proposals (RFP) for a Security Risk Assessment of its Department of State according to the ISO 27001/27002 or NIST Cybersecurity Framework standard. Such assessments are a key part of understanding how current activities may need to be modified or supplemented to address new threats. The Assessment is slated to be completed by May 2018. It should provide the Rhode Island Secretary of State with a roadmap of gaps and deficiencies as well as remediation options to address them. This systematic approach recognizes the ever-evolving nature of the election security threat landscape.

Washington: Multi-State Information Sharing

As one of the 21 states identified as being targets of Russian hacking attempts, Washington partnered with the Department of Homeland Security (DHS) and the Multi-State Information Sharing & Analysis Center (MS-ISAC) in September. The goal of the three-month pilot project announced in September was to assess vulnerabilities and identify mitigation plans; share information; rely on DHS for local in-person support; and report incidents or threats. Sharing technical and non-technical data about incidents to other states via the MS-ISAC is critical in keeping election officials and DHS informed of potential attacks and defensive measures.

West Virginia: Air National Guard

The West Virginia Secretary of State announced in September a partnership with the Air National Guard to assess election systems and monitor those systems for malicious activity. Under the partnership, an Air National Guard Cyber Systems Operations specialist will be embedded in the Secretary of State’s office, as well as in the West Virginia Intelligence Fusion Center. The benefit of embedding a specialist in those offices is to link the mission of the National Guard with the needs of the election officials and the situational awareness of law enforcement officials monitoring criminal and terrorist activities in the state.

Election security is the process of anticipating and responding to ever-evolving threats in an environment where voter confidence can be swayed just as much by perception as reality. The CAP report and recent news reports only provide a snapshot of where states fall short in their security efforts. Some states, like Colorado, Illinois, Rhode Island, Washington, and West Virginia, are already on their way to improving their election security grade. We would love to hear about other efforts states are engaged in to better defend against, detect, and recover from attacks.

Election Security Grades by State – Center for American Progress (February 2018)


When the Copyright Office Meets, the Future Needs a Seat at the Table

EFF News - Wed, 2018-02-21 12:33

Every three years, EFF's lawyers spend weeks huddling in their offices, composing carefully worded pleas we hope will persuade the Copyright Office and the Librarian of Congress to grant Americans a modest, temporary permission to use our own property in ways that are already legal.

Yeah, we think that's weird, too. But it's been that way ever since 1998, when Congress passed the Digital Millennium Copyright Act, whose Section 1201 established a ban on tampering with "access controls for copyrighted works" (also known as "Digital Rights Management" or "DRM"). It doesn't matter if you want to do something absolutely legitimate, something that there is no law against -- if you have to bypass DRM to do it, it's not allowed.

What's more, if someone wants to provide you with a tool to get around the DRM, they could face up to five years in prison and a $500,000 fine, for a first offense, even if the tool is only ever used to accomplish legal, legitimate ends.

Which brings us back to EFF's lawyers, sweating over their briefs every three years. The US Copyright Office holds proceedings every three years to determine whether it should recommend that the Librarian of Congress grant some limited exemptions to this onerous rule. Every three years, EFF begs for -- and wins -- some of these exemptions, by explaining how something people used to be able to do has been shut down by DMCA 1201 and the DRM it supports.

But you know what we don't get to do? We don't get to ask for the right to break DRM to do things that no one has ever thought of -- at least, that they haven't thought of yet. We don't get to brief the Copyright Office on the harms to companies that haven't been founded yet, the gadgets they haven't designed yet, and the users they haven't attracted yet. Only the past gets a seat at the table: the future isn't welcome.

That's a big problem. Many of the tools and technologies we love today were once transgressive absurdities: mocked for being useless and decried as immoral or even criminal. The absurd transgressors found ways to use existing technologies and products to build new businesses, over the howls of objections from the people who'd come before them.

It's a long and honorable tradition, and without it, we wouldn't have cable TV (reviled as thieves by the broadcasters in their early days); Netflix (called crooks by the Hollywood studios for mailing DVDs around in red envelopes); or iTunes ("Rip, Mix, Burn" was damned as a call to piracy by the record industry).

These businesses exist because they did something that wasn't customary, something rude and disorderly and controversial -- they did things that were legal, but unsanctioned by the businesses they were doing those things to.

And today, as these businesses have reached maturity, the so-called pirates have become admirals. Today, these former disruptors also use DRM and are glad that bypassing their DRM to do something legal is banned (because their shareholders prefer it that way).

Those companies aren't doing themselves any favors, either. Even as Apple was asking the Copyright Office to ban third-party modifications to the iPhone, it was copying these unauthorized innovations and including them in the official versions of its products.

Our Catalog of Missing Devices gives you a sense of what we've lost because DMCA 1201 has given the companies that succeeded last year the right to decide who can compete with them in the years to come.

It's a year that's divisible by three, and that means that EFF is back at the Copyright Office, pleading for the right of the past to go on in the present -- but we can't ask the Copyright Office to protect the future; the DMCA doesn't allow it.

That's why we've sued the US Government to invalidate Section 1201 of the DMCA: Congress made a terrible blunder in 1998 when it created that law, and the effects of that blunder mount with each passing year. We need to correct it -- and the sooner, the better.

Categories: Privacy

Opposing the Mandating of Kill Switches to Address Contraband Cell Phones

CDT - Wed, 2018-02-21 11:52

Citing the potential threat to law enforcement and the general public, correctional facility officials have pushed for the FCC to address the issue of contraband phone use in prisons. In a recent meeting hosted by the FCC, Department of Justice officials and local law enforcement argued for aggressive technological approaches to addressing contraband phones.  

Now, the FCC is considering a mandate for hard kill switches on all wireless devices. This proposal would provide correctional facility officers with the ability to permanently disable (or “brick”) a phone upon request. However, the broad scope of this proposal will create new security vulnerabilities, and the lack of judicial review would violate established protections for due process. CDT has joined our colleagues at the Electronic Frontier Foundation (EFF) in opposing this proposal and expressing our concerns in an ex parte filing to the FCC.

From a technological perspective, the mandatory installation of a hard kill switch on all wireless devices would create an explicit security vulnerability on every phone. While corrections officers are seeking a method to disable contraband phones within the confines of their facilities, this vulnerability will not exist in a vacuum. It will be difficult to secure, and malicious actors may hijack or create their own hard kill signals, regardless of where the phone is being used.

The use of a hard kill switch also poses serious risks to users when the wrong phone is identified. If a device is misidentified as contraband and subsequently disabled, the owner of the device will be permanently deprived of their device without any warning or explanation. This would represent more than a minor inconvenience for a handful of people: 95 percent of Americans now rely upon a cellphone for communication and information. Under these circumstances, the use of a hard kill switch would cut off access to friends, family, and emergency services. Ultimately, this mandate represents an overly broad and severe technological remedy that will only undermine the security and integrity of wireless devices.

Additionally, the proposal fails to provide any form of judicial review to enable oversight of the process and ensure accuracy. Instead, the process outlined by the Commission would shoehorn providers into the role reserved for judges. Providers would be asked to evaluate whether a request meets the necessary legal criteria without the procedural structure, experience, or institutional authority of the courts. By incorporating judicial review, judges would be empowered to lend their expertise and critically assess claims from law enforcement, potentially providing a valuable check against misidentification.

Most importantly, courts provide legal safeguards and preserve fundamental due process rights. When correctional officials activate the hard kill switch, they are permanently disabling the phone, effectively destroying it and depriving an individual of their property. And although prisoners are entitled to fewer due process protections, the proposal outlined by the FCC may end up disabling phones found outside the correctional facility, placing the legitimate devices of law-abiding individuals at risk.

In working to address the issue of contraband phones in prisons, the FCC must avoid any technological mandates that would undermine the security of all cell phone users (aka everyone in the country) and needs to include due process safeguards in any proposal.

Categories: Privacy

Republican DACA Bill Would Expand Use of Drones, Biometrics

EPIC - Wed, 2018-02-21 09:10

The Secure and Succeed Act (S. Amdt. 1959 to H.R. 2579), sponsored by several Republican Senators, would link DACA with high-tech border surveillance. Customs and Border Protection would use facial recognition and other biometric technologies to inspect travelers, both US citizens and non-citizens, at airports. The bill would also establish "Operation Phalanx," which instructs the Department of Defense—a military agency—to use drones for domestic surveillance. EPIC has pursued many FOIA cases on border surveillance involving biometrics, drones, and airport body scanners. In a statement to Congress, EPIC warned that "many of the techniques that are proposed to enhance border surveillance have direct implications for the privacy of American citizens."

Categories: Privacy

The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation

EFF News - Tue, 2018-02-20 19:30

In the coming decades, artificial intelligence (AI) and machine learning technologies are going to transform many aspects of our world. Much of this change will be positive; the potential for benefits in areas as diverse as health, transportation and urban planning, art, science, and cross-cultural understanding is enormous. We've already seen things go horribly wrong with simple machine learning systems; but increasingly sophisticated AI will usher in a world that is strange and different from the one we're used to, and there are serious risks if this technology is used for the wrong ends.

Today EFF is co-releasing a report with a number of academic and civil society organizations on the risks from malicious uses of AI and the steps that should be taken to mitigate them in advance.

At EFF, one area of particular concern has been the potential interactions between computer insecurity and AI. At present, computers are inherently insecure, and this makes them a poor platform for deploying important, high-stakes machine learning systems. It's also the case that AI might have implications for computer [in]security that we need to think about carefully in advance. The report looks closely at these questions, as well as the implications of AI for physical and political security. You can read the full document here.

Categories: Privacy

EPIC Amicus: Supreme Court to Hear Arguments in Wiretap Act Case

EPIC - Tue, 2018-02-20 17:15

The Supreme Court will hear arguments this week in Dahda v. United States, a case concerning the federal Wiretap Act and the suppression of evidence obtained following an invalid wiretap order. The Wiretap Act requires exclusion of evidence obtained as a result of an invalid order, but a lower court denied suppression in the case even though the order was unlawfully broad. In an amicus brief, EPIC wrote that "it is not for the courts to create textual exceptions" to federal privacy laws. EPIC explained that Congress enacted strict and unambiguous privacy provisions in the Wiretap Act. "If the government wishes a different outcome," EPIC wrote, "then it should go to Congress to revise the statute." EPIC routinely participates as amicus curiae in privacy cases before the Supreme Court, most recently in Byrd v. United States (suspicionless searches of rental cars) and Carpenter v. United States (warrantless searches of cellphone location records).

Categories: Privacy

Did Congress Really Expect Us to Whittle Our Own Personal Jailbreaking Tools?

EFF News - Tue, 2018-02-20 14:46

In 1998, Congress passed the Digital Millennium Copyright Act (DMCA), and profoundly changed the relationship of Americans to their property.

Section 1201 of the DMCA bans the bypassing of "access controls" for copyrighted works. Originally, this meant that even though you owned your DVD player, and even though it was legal to bring DVDs home with you from your European holidays, you weren't allowed to change your DVD player so that it would play those out-of-region DVDs. DVDs were copyrighted works, the region-checking code was an access control, and so even though you owned the DVD, and you owned the DVD player, and even though you were allowed to watch the disc, you weren't allowed to modify your DVD player to play your DVD (which you were allowed to watch).
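To make the "access control" concrete, here is a purely illustrative sketch of the region-checking logic described above. Real DVD players enforce this in licensed firmware, not in Python; the function name and constants are hypothetical, though the region codes themselves (0 for region-free, 1-8 for geographic regions) are real.

```python
# Hypothetical sketch of DVD region-locking logic, for illustration only.
# Real players implement this check in firmware under the CSS license.

PLAYER_REGION = 1  # e.g. a player sold in North America (Region 1)

def can_play(disc_region: int, player_region: int = PLAYER_REGION) -> bool:
    """Region 0 ('region-free') discs play anywhere; otherwise the
    disc's region code must match the player's region code."""
    return disc_region == 0 or disc_region == player_region

# A Region 2 (European) disc is refused by a Region 1 player,
# even though watching the disc itself is perfectly legal:
assert not can_play(2)
assert can_play(1)  # matching region plays
assert can_play(0)  # region-free discs always play
```

Under DMCA 1201, it is bypassing this kind of check, not watching the disc, that is prohibited.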

Experts were really worried about this: law professors, technologists and security experts saw that soon we'd have software—that is, copyrighted works—in all kinds of devices, from cars to printer cartridges to voting machines to medical implants to thermostats. If Congress banned tinkering with the software in the things you owned, it would tempt companies to use that software to create "private laws" that took away your rights to use your property in the way you saw fit. For example, it's legal to use third-party ink in your HP printer, but once HP changed its printers to reject third-party ink, it could argue that anything you did to change them back was a violation of the DMCA.

Congress's compromise was to order the Library of Congress and the Copyright Office to hold hearings every three years, in which the public would be allowed to complain about ways in which these locks got in the way of their legitimate activities. Corporations weigh in about why their business interests outweigh your freedom to use your property for legitimate ends, and then the regulators deliberate and create some temporary exemptions, giving the public back the right to use their property in legal ways, even if the manufacturers of their property don't like it.

If it sounds weird that you have to ask the Copyright Office for permission to use your property, strap in, we're just getting started.

Here's where it gets weird: DMCA 1201 allows the Copyright Office to grant "use" exemptions, but not "tools" exemptions. That means that if the Copyright Office likes your proposal, they can give you permission to jailbreak your gadgets to make some use (say, install third-party apps on your phone, or record clips from your DVDs to use in film studies classes), but they can't give anyone the right to give you the tool needed to make that use (law professor and EFF board member Pam Samuelson argues that the Copyright Office can go farther than this, at least some of the time, but the Copyright Office disagrees).

Apparently, fans of DMCA 1201 believe that the process for getting permission to use your own stuff should go like this:

1. A corporation sells you a gadget that disallows some activity, or they push a software update to a gadget you already own to take away a feature it used to have;

2. You and your lawyers wait up to three years, then you write to the Copyright Office explaining why you think this is unfair;

3. The corporation that made your gadget tells the Copyright Office that you're a whiny baby who should just shut up and take it;

4. You write back to the Copyright Office to defend your use;

5. Months later, the Library of Congress gives you a limited permission to use your property (maybe);

And then...

6. You get a degree in computer science, and subject your gadget to close scrutiny to find a flaw in the manufacturer's programming;

7. Without using code or technical information from anyone else (including other owners of the same gadget) you figure out how to exploit that flaw to let you use your device in the way the government just said you could;

8. Three years later, you do it again.

Now, in practice, that's not how it works. In practice, people who want to use their own property in ways that the Copyright Office approves of just go digging around on offshore websites, looking for software that lets them make that use. (For example, farmers download alternative software for their John Deere tractors from websites they think might be maintained by Ukrainian hackers, though no one is really sure). If that software bricks their device, or steals their personal information, they have no remedy, no warranty, and no one to sue for cheating them.

That's the best case.

But often, the Library of Congress makes it even harder to make the uses they're approving. In 2015, they granted car owners permission to jailbreak their cars in order to repair them—but they didn't give mechanics the right to jailbreak the cars they were fixing. That ruling means that you, personally, can fix your car, provided that 1) you know how to fix a car; and 2) you can personally jailbreak the manufacturer's car firmware (in addition to abiding by the other snares in the final exemption language).

In other cases, the Copyright Office limits the term of the exemption as well as the scope: in the 2015 ruling, the Copyright Office gave security researchers the right to jailbreak systems to find out whether they were secure enough to be trusted, but not industrial systems (whose security is very important and certainly needs to be independently verified by those systems' owners!) and they also delayed the exemption's start for a full year, meaning that security researchers would only get two years to do their jobs before they'd have to go back to the Copyright Office and start all over again.

This is absurd.

Congress crafted the exemptions process to create an escape valve on the powerful tool it was giving to manufacturers with DMCA 1201. But even computer scientists don't hand-whittle their own software tools for every activity: like everyone else, they rely on specialized toolsmiths who make software and hardware that is tested, warranted, and maintained by dedicated groups, companies and individuals. The idea that every device in your home will have software that limits your use, and you can only get those uses back by first begging an administrative agency and then gnawing the necessary implement to make that use out of the lumber of your personal computing environment is purely absurd.

The Copyright Office is in the middle of a new rulemaking, and we've sent in requests for several important exemptions, but we're not kidding ourselves here: as important as it is to get the US government to officially acknowledge that DMCA 1201 locks up legitimate activities, and as important as these exemptions are to protecting end users, without the right to avail yourself of tools, the exemptions don't solve the whole problem.

That's why we're suing the US government to invalidate DMCA 1201. DMCA 1201 wasn't fit for purpose in 1998, and it has shown its age and contradictions more with each passing year.

Categories: Privacy

Supreme Court Leaves Data Breach Decision In Place

EPIC - Tue, 2018-02-20 13:40

The Supreme Court has denied a petition for a writ of certiorari in Carefirst, Inc. v. Attias, a case concerning standing to sue in data breach cases. Consumers had sued health insurer Carefirst after faulty security practices allowed hackers to obtain 1.1 million customer records. EPIC filed an amicus brief backing the consumers, arguing that if "companies fail to invest in reasonable security measures, then consumers will continue to face harm from data breaches." The federal appeals court agreed with EPIC and held that consumers may sue companies that fail to safeguard their personal data. Carefirst appealed the decision, but the Supreme Court chose not to take the case. EPIC regularly files amicus briefs defending standing in consumer privacy cases, most recently in Eichenberger v. ESPN, where the Ninth Circuit also held for consumers, as well as Gubala v. Time Warner Cable and In re SuperValu Customer Data Security Breach Litigation.

Categories: Privacy

House Draft Data Security Bill Preempts Stronger State Safeguards

EPIC - Fri, 2018-02-16 14:45

Rep. Luetkemeyer (R-MO) and Rep. Maloney (D-NY) circulated a draft bill, the "Data Acquisition and Technology Accountability and Security Act," that would set federal requirements for companies collecting personal data and require prompt breach notification. The Federal Trade Commission, which has often failed to pursue important data breach cases, and state Attorneys General would both be responsible for enforcing the law. The law would only trigger liability if the personal data breached is "reasonably likely to result in identity theft, fraud, or economic loss" and would preempt stronger state data breach laws. Earlier this week, EPIC President Marc Rotenberg testified before the House, calling for comprehensive data privacy legislation that would preserve stronger state laws. Last fall, EPIC testified at a Senate hearing on the Equifax breach, calling it one of the worst in U.S. history.

Categories: Privacy

Mueller Indicts Russian Nationals, Entities for Election Interference

EPIC - Fri, 2018-02-16 14:20

Special Counsel Robert Mueller has indicted thirteen Russian nationals and three Russian entities for interfering in the 2016 U.S. presidential election. "Beginning as early as 2014" the defendants began operations "to interfere with the U.S. political system" and "sow discord," the indictment explains. They also posed as U.S. persons online, reaching "significant numbers of Americans" on social media. EPIC first sought details of the Russians' "multifaceted" influence campaign in January 2017, pursuing release of the complete Intelligence Community assessment on Russian meddling. EPIC President Marc Rotenberg recently highlighted the role of the Russian Internet Research Agency, named in the Mueller indictment, explaining, "Facebook sold advertising to Russian troll farms working to undermine the American political process." EPIC launched a new project on Democracy and Cybersecurity in early 2017 to help preserve democratic institutions.

Categories: Privacy