
EFF News


EFF's Deeplinks Blog: Noteworthy news from around the internet

San Francisco: Building Community Broadband to Protect Net Neutrality and Online Privacy

Fri, 2018-02-23 18:36

Like many cities around the country, San Francisco is considering an investment in community broadband infrastructure: high-speed fiber that would make Internet access cheaper and better for city residents. Community broadband can help alleviate a number of issues with Internet access that we see all over America today. Many Americans have no choice of provider for high-speed Internet, Congress eliminated user privacy protections in 2017, and the FCC decided to roll back net neutrality protections in December.

This week, San Francisco published the recommendations of a group of experts, including EFF’s Kit Walsh, regarding how to protect the privacy and speech of those using community broadband.

The Blue Ribbon Panel on Municipal Fiber released its third report, which tackles competition, security, privacy, net neutrality, and more. It recommends that San Francisco’s community broadband require net neutrality and privacy protections: any ISP looking to use the city’s infrastructure would have to adhere to certain standards. The model of community broadband that EFF favors is sometimes called “dark fiber” or “open access.” In this model, the government invests in fiber infrastructure, then opens it up for private companies to compete as your ISP. This means the big incumbent ISPs can no longer block new competitors from offering you Internet service. San Francisco is pursuing the “open access” option, and is quite far along in its process.

The “open access” model is preferable to one in which the government itself acts as the ISP, because of the civil liberties risks posed by a government acting as your conduit to information.

Of course, private ISPs can also abuse your privacy and restrict your opportunities to speak and learn online.

To prevent such harms, the expert panel explained how the city could best operate its network so that competition, as well as legal requirements, would prevent ISPs from violating net neutrality or the privacy of residents.

That would include, as was found in the 2015 Open Internet Order recently repealed by the FCC, a ban on blocking of sites, content, or applications; a ban on throttling sites, content, or applications; and a ban on paid prioritization, where ISPs favor themselves or companies who have paid them by giving their content better treatment.

The report also recommends requiring a number of consumer protections that Congress prevented from ever being enacted. If an ISP wants to sell or share a customer’s personal information with anyone, the customer would have to give permission first. Even the use of data that doesn’t identify someone would require permission. Both of these would have to be “opt-in,” meaning that, absent an answer from the customer, it would be assumed there was no consent to use the data. (“Opt-out” would mean that using customer data is assumed to be fine unless the customer figures out how to tell the ISP no.)
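The difference between the two defaults comes down to how silence is interpreted. Here is a minimal, purely illustrative sketch (the class and function names are our own, not from the report) showing how the same unanswered customer is treated under each rule:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Customer:
    name: str
    consent: Optional[bool] = None  # None means the customer never responded

def may_use_data(customer: Customer, policy: str) -> bool:
    """Decide whether an ISP may use this customer's data under a policy.

    "opt-in":  silence means NO -- explicit permission is required first.
    "opt-out": silence means YES -- use is allowed until the customer refuses.
    """
    if policy == "opt-in":
        return customer.consent is True
    if policy == "opt-out":
        return customer.consent is not False
    raise ValueError(f"unknown policy: {policy!r}")

alice = Customer("Alice")  # never answered either way

print(may_use_data(alice, "opt-in"))   # False: no answer is treated as "no"
print(may_use_data(alice, "opt-out"))  # True: no answer is treated as "yes"
```

The report's recommendation amounts to requiring the first branch: data use is off by default, and only an explicit "yes" turns it on.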

Furthermore, the goal is to build infrastructure that connects every home and business to a fiber optic network, guaranteeing everyone in the city access to fast, reliable Internet. And while the actual lines will be owned by the city, it will be an “open-access” model—that is, space on the city-owned lines will be leased to private companies, creating competition and choice.

The report also recommends that San Francisco require ISPs to protect privacy when faced with legal challenges or demands from government agencies. It recommends that San Francisco require ISPs using its network to do a number of things to help protect users’ civil liberties and privacy: for example, give up the right to look at customer communications, give up the right to consent to searches of those communications, and promise to tell customers, unless prohibited by law, when they’re being asked to hand over information.

With all of these things combined, San Francisco’s community broadband looks to be doing as much as possible to provide choices while also ensuring that all of those options lead to a safe and secure connection to a free and open Internet. That’s something we can all work toward in our communities.

Categories: Privacy

The Federal Circuit Should Not Allow Patents on Inventions that Should Belong to the Public

Fri, 2018-02-23 16:19

One of the most fundamental aspects of patent law is that patents should only be awarded for new inventions. That is, not only does someone have to invent something new to them in order to receive a patent, it must also be new to the world. Even if someone independently comes up with an idea, that person shouldn’t get a patent if someone else already came up with the same idea and told the public.

There’s good reason for this: patents are an artificial restraint on trade. They work to increase costs (the patent owner is rewarded with higher prices) and can impede follow-on innovation. Policy makers generally try to justify what would otherwise be considered a monopoly through the argument that without patents, inventors may never have invested in research or might not want to make their inventions public. Thus, the story goes, we should give people limited monopolies in the hopes that overall, we end up with more innovation (whether this is actually true, particularly for software, is debatable).

A rule of the U.S. Court of Appeals for the Federal Circuit, however, upends the patent bargain and allows a second-comer—someone who wasn’t the first inventor—to get a patent under a particular, albeit fairly limited, circumstance. A new petition challenges this rule, and EFF has filed an amicus brief in support of undoing the Federal Circuit’s misguided rule.

The rule is based on highly technical details of the Patent Act, which you can read about in our brief along with those of Ariosa (the patent challenger) and a group of law professors (not yet available). Our brief argues that the Federal Circuit rule is an incorrect understanding of the law. We ask the Federal Circuit to rehear the issue with the full court, and reverse its current rule.

While the Federal Circuit rule is fairly limited and doesn’t arise in many situations, we have significant concerns about the policy it seems to espouse. Contrary to decades of Supreme Court precedent, the rule allows, under certain circumstances, someone to get a patent on something that had already been disclosed to the public. We believe that is always bad policy.


FOSTA Would Be a Disaster for Online Communities

Thu, 2018-02-22 19:41

Frankenstein Bill Combines the Worst of SESTA and FOSTA. Tell Your Representative to Reject New Version of H.R. 1865.

The House of Representatives is about to vote on a bill that would force online platforms to censor their users. The Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA, H.R. 1865) might sound noble, but it would do nothing to stop sex traffickers. What it would do is force online platforms to police their users’ speech more forcefully than ever before, silencing legitimate voices in the process.

Back in December, we said that while FOSTA was a very dangerous bill, its impact on online spaces would not be as broad as the Senate bill, the Stop Enabling Sex Traffickers Act (SESTA, S. 1693). That’s about to change.

The House Rules Committee is about to approve a new version of FOSTA [.pdf] that incorporates most of the dangerous components of SESTA. This new Frankenstein’s Monster of a bill would be a disaster for Internet intermediaries, marginalized communities, and even trafficking victims themselves.

If you don’t want Congress to undermine the online communities we all rely on, please take a moment to call your representative and urge them to oppose FOSTA.

Take Action

Stop FOSTA

Gutting Section 230 Is Not a Solution

The problem with FOSTA and SESTA isn’t a single provision or two; it’s the whole approach.

FOSTA would undermine Section 230, the law protecting online platforms from some types of liability for their users’ speech. As we’ve explained before, the modern Internet is only possible thanks to a strong Section 230. Without Section 230, most of the online platforms we use would never have been formed—the risk of liability for their users’ actions would have simply been too high.

Section 230 strikes an important balance for when online platforms can be held liable for their users’ speech. Contrary to FOSTA supporters’ claims, Section 230 does nothing to protect platforms that break federal criminal law. In particular, if an Internet company knowingly engages in the advertising of sex trafficking, the U.S. Department of Justice can and should prosecute it. Additionally, Internet companies are not immune from civil liability for user-generated content if plaintiffs can show that a company had a direct hand in creating the illegal content.

The new version of FOSTA would destroy that careful balance, opening platforms to increased criminal and civil liability at both the federal and state levels. This includes a new federal sex trafficking crime targeted at web platforms (in addition to 18 U.S.C. § 1591)—but which would not require a platform to have knowledge that people are using it for sex trafficking purposes. This also includes exceptions to Section 230 for state law criminal prosecutions against online platforms, as well as civil claims under federal law and civil enforcement of federal law by state attorneys general.

Perhaps most disturbingly, the new version of FOSTA would make the changes to Section 230 apply retroactively: a platform could be prosecuted for failing to comply with the law before it was even passed.

FOSTA Would Chill Innovation

Together, these measures would chill innovation and competition among Internet companies. Large companies like Google and Facebook may have the budgets to survive the massive increase in litigation and liability that FOSTA would bring. They may also have the budgets to implement a mix of automated filters and human censors to comply with the law. Small startups don’t. And with the increased risk of litigation, it would be difficult for new startups ever to find the funding they need to compete with Google.

Today’s large Internet companies would not have grown to prominence without the protections of Section 230. FOSTA would raise the ladder that has allowed those companies to grow, making it very difficult for newcomers ever to compete with them.

FOSTA Would Censor Victims

Congress should think long and hard before dismantling the very tools that have proven most effective in fighting trafficking.

More dangerous still is the impact that FOSTA would have on online speech. Facing the threat of extreme criminal and civil penalties, web platforms large and small would have little choice but to silence legitimate voices. Supporters of SESTA and FOSTA pretend that it’s easy to distinguish online postings related to sex trafficking from ones that aren’t. It’s not—and it’s impossible at the scale needed to police a site as large as Facebook or Reddit. The problem is compounded by FOSTA’s expansion of federal prostitution law. Platforms would have to take extreme measures to remove a wide range of postings, especially those related to sex.

Some supporters of these bills have argued that platforms can rely on automated filters in order to distinguish sex trafficking ads from legitimate content. That argument is laughable. It’s difficult for a human to distinguish between a legitimate post and one that supports sex trafficking; a computer certainly could not do it with anything approaching 100% accuracy. Instead, platforms would have to calibrate their filters to over-censor. When web platforms rely too heavily on automated filters, it often puts marginalized voices at a disadvantage.
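To see why filtering at scale over-censors, consider a deliberately naive keyword filter (the blocked terms and example posts below are hypothetical, invented for illustration). The filter cannot tell an ad apart from a victim asking for help or a news report, so all three get removed:

```python
# A toy keyword filter; the blocked terms are purely illustrative.
BLOCKED_TERMS = {"trafficking", "escort"}

def would_remove(post: str) -> bool:
    """Flag a post for removal if it contains any blocked term."""
    return bool(set(post.lower().split()) & BLOCKED_TERMS)

posts = [
    "escort services available tonight",                    # the intended target
    "i escaped trafficking and need help finding shelter",  # a victim asking for help
    "new state report on labor trafficking enforcement",    # journalism
]

for post in posts:
    print(would_remove(post), post)  # prints True for all three
```

Real-world filters are more sophisticated than this sketch, but the underlying tradeoff is the same: a platform facing criminal liability will tune its filter toward removal, and the false positives fall hardest on exactly the speech the bill’s supporters claim to protect.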

Most tragically of all, the first people censored would likely be sex trafficking victims themselves. The very same words and phrases that a filter would use to attempt to delete sex trafficking content would also be used by victims of trafficking trying to get help or share their experiences.

There are many, many stories of traffickers being caught by law enforcement thanks to clues that police officers and others found on online platforms. Congress should think long and hard before dismantling the very tools that have proven most effective in fighting trafficking.

FOSTA Is the Wrong Approach

There is no amendment to FOSTA that would make it effective at fighting online trafficking while respecting the civil liberties of everyone online. That’s because the problem with FOSTA and SESTA isn’t a single provision or two; it’s the whole approach.

Creating more legal tools to go after online platforms would not punish sex traffickers. It would punish all of us, wrecking the safe online communities that we use every day. And in the process, it would also undermine the tools that have proven most effective at putting traffickers in prison. FOSTA is not the right solution, and no trimming around the edges will make it the right solution.

If you care about protecting the safety of our online communities—if you care about protecting everyone’s right to speak online, even about sensitive topics—we urge you to call your representative today and tell them to reject FOSTA.

Take Action

Stop FOSTA


The FCC’s Net Neutrality Order Was Just Published, Now the Fight Really Begins

Thu, 2018-02-22 16:26

Today, the FCC’s so-called “Restoring Internet Freedom Order,” which repealed the net neutrality protections the FCC had previously created with the 2015 Open Internet Order, was officially published. That means the clock has started ticking on all the ways we can fight back.

While the rule was published today, it doesn’t take effect quite yet: ISPs can’t start blocking, throttling, or engaging in paid prioritization for a little while. So while we still have the protections of the 2015 Open Internet Order, and we finally have a published version of the “Restoring Internet Freedom Order,” it’s time to act.

First, under the Congressional Review Act (CRA), Congress can reverse a change in regulation with a simple majority vote. That would bring the 2015 Open Internet Order back into effect. Congress has 60 working days—starting from when the rule is published in the official record—to do this. So those 60 days start now.

The Senate bill has 50 supporters, only one away from the majority it needs to pass. The House of Representatives is a bit further away. By our count, 114 representatives have made public commitments in support of voting for a CRA action. Now that time is ticking down for the vote, tell Congress to save the existing net neutrality rules.

Second, it is now unambiguous that the lawsuits of 22 states, public interest groups, Mozilla, and the Internet Association can begin. While the FCC decision said lawsuits had to wait until ten days after the official publication, there was some question about whether federal law said otherwise. So while some suits have already been filed, with the FCC’s 10-day counter now started, it’s clear that the lawsuits can proceed.

And, of course, states and other local governments continue to move forward on their own measures to protect net neutrality. 26 state legislatures are considering net neutrality legislation and five governors have issued executive orders on net neutrality. EFF has some ideas on how state law can stand up to the FCC order. Community broadband can also ensure that net neutrality principles are enacted on a local level. For example, San Francisco is currently looking for proposals to build an open-access network that would require net neutrality guarantees from any ISP looking to offer services over the city-owned infrastructure.

So while the FCC’s vote in December was in direct contradiction to the wishes of the majority of Americans, the publication of the order means that real action can now begin.


When the Copyright Office Meets, the Future Needs a Seat at the Table

Wed, 2018-02-21 12:33

Every three years, EFF's lawyers spend weeks huddling in their offices, composing carefully worded pleas we hope will persuade the Copyright Office and the Librarian of Congress to grant Americans a modest, temporary permission to use our own property in ways that are already legal.

Yeah, we think that's weird, too. But it's been that way ever since 1998, when Congress passed the Digital Millennium Copyright Act, whose Section 1201 established a ban on tampering with "access controls for copyrighted works" (also known as "Digital Rights Management" or "DRM"). It doesn't matter if you want to do something absolutely legitimate, something that there is no law against -- if you have to bypass DRM to do it, it's not allowed.

What's more, if someone wants to provide you with a tool to get around the DRM, they could face up to five years in prison and a $500,000 fine, for a first offense, even if the tool is only ever used to accomplish legal, legitimate ends.

Which brings us back to EFF's lawyers, sweating over their briefs every three years. The US Copyright Office holds proceedings every three years to determine whether it should recommend that the Librarian of Congress grant some limited exemptions to this onerous rule. Every three years, EFF begs for -- and wins -- some of these exemptions, by explaining how something people used to be able to do has been shut down by DMCA 1201 and the DRM it supports.

But you know what we don't get to do? We don't get to ask for the right to break DRM to do things that no one has ever thought of -- at least, that they haven't thought of yet. We don't get to brief the Copyright Office on the harms to companies that haven't been founded yet, the gadgets they haven't designed yet, and the users they haven't attracted yet. Only the past gets a seat at the table: the future isn't welcome.

That's a big problem. Many of the tools and technologies we love today were once transgressive absurdities: mocked for being useless and decried as immoral or even criminal. The absurd transgressors found ways to use existing technologies and products to build new businesses, over the howls of objections from the people who'd come before them.

It's a long and honorable tradition, and without it, we wouldn't have cable TV (reviled as thieves by the broadcasters in their early days); Netflix (called crooks by the Hollywood studios for mailing DVDs around in red envelopes); or iTunes ("Rip, Mix, Burn" was damned as a call to piracy by the record industry).

These businesses exist because they did something that wasn't customary, something rude and disorderly and controversial -- they did things that were legal, but unsanctioned by the businesses they were doing those things to.

And today, as these businesses have reached maturity, the so-called pirates have become admirals. Today, these former disruptors also use DRM and are glad that bypassing their DRM to do something legal is banned (because their shareholders prefer it that way).

Those companies aren't doing themselves any favors, either. Even as Apple was asking the Copyright Office to ban third-party modifications to the iPhone, it was copying these unauthorized innovations and including them in the official versions of its products.

Our Catalog of Missing Devices gives you a sense of what we've lost because DMCA 1201 has given the companies that succeeded last year the right to decide who can compete with them in the years to come.

It's a year that's divisible by three, and that means that EFF is back at the Copyright Office, pleading for the right of the past to go on in the present -- but we can't ask the Copyright Office to protect the future, because the DMCA doesn't allow it.

That's why we've sued the US Government to invalidate Section 1201 of the DMCA: Congress made a terrible blunder in 1998 when it created that law, and the effects of that blunder mount with each passing year. We need to correct it -- and the sooner, the better.


The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation

Tue, 2018-02-20 19:30

In the coming decades, artificial intelligence (AI) and machine learning technologies are going to transform many aspects of our world. Much of this change will be positive; the potential for benefits in areas as diverse as health, transportation and urban planning, art, science, and cross-cultural understanding is enormous. We've already seen things go horribly wrong with simple machine learning systems; but increasingly sophisticated AI will usher in a world that is strange and different from the one we're used to, and there are serious risks if this technology is used for the wrong ends.

Today EFF is co-releasing a report with a number of academic and civil society organizations on the risks from malicious uses of AI and the steps that should be taken to mitigate them in advance.

At EFF, one area of particular concern has been the potential interactions between computer insecurity and AI. At present, computers are inherently insecure, and this makes them a poor platform for deploying important, high-stakes machine learning systems. It's also the case that AI might have implications for computer [in]security that we need to think about carefully in advance. The report looks closely at these questions, as well as the implications of AI for physical and political security. You can read the full document here.


Did Congress Really Expect Us to Whittle Our Own Personal Jailbreaking Tools?

Tue, 2018-02-20 14:46

In 1998, Congress passed the Digital Millennium Copyright Act (DMCA), and profoundly changed the relationship of Americans to their property.

Section 1201 of the DMCA bans the bypassing of "access controls" for copyrighted works. Originally, this meant that even though you owned your DVD player, and even though it was legal to bring DVDs home with you from your European holidays, you weren't allowed to change your DVD player so that it would play those out-of-region DVDs. DVDs were copyrighted works, the region-checking code was an access control, and so even though you owned the DVD, and you owned the DVD player, and even though you were allowed to watch the disc, you weren't allowed to modify your DVD player to play your DVD (which you were allowed to watch).

Experts were really worried about this: law professors, technologists and security experts saw that soon we'd have software—that is, copyrighted works—in all kinds of devices, from cars to printer cartridges to voting machines to medical implants to thermostats. If Congress banned tinkering with the software in the things you owned, it would tempt companies to use that software to create "private laws" that took away your rights to use your property in the way you saw fit. For example, it's legal to use third party ink in your HP printer, but once HP changed its printers to reject third-party ink, they could argue that anything you did to change them back was a violation of the DMCA.

Congress's compromise was to order the Library of Congress and the Copyright Office to hold hearings every three years, in which the public would be allowed to complain about ways in which these locks got in the way of their legitimate activities. Corporations weigh in about why their business interests outweigh your freedom to use your property for legitimate ends, and then the regulators deliberate and create some temporary exemptions, giving the public back the right to use their property in legal ways, even if the manufacturers of their property don't like it.

If it sounds weird that you have to ask the Copyright Office for permission to use your property, strap in, we're just getting started.

Here's where it gets weird: DMCA 1201 allows the Copyright Office to grant "use" exemptions, but not "tools" exemptions. That means that if the Copyright Office likes your proposal, they can give you permission to jailbreak your gadgets to make some use (say, install third-party apps on your phone, or record clips from your DVDs to use in film studies classes), but they can't give anyone the right to give you the tool needed to make that use (law professor and EFF board member Pam Samuelson argues that the Copyright Office can go farther than this, at least some of the time, but the Copyright Office disagrees).

Apparently, fans of DMCA 1201 believe that the process for getting permission to use your own stuff should go like this:

1. A corporation sells you a gadget that disallows some activity, or they push a software update to a gadget you already own to take away a feature it used to have;

2. You and your lawyers wait up to three years, then you write to the Copyright Office explaining why you think this is unfair;

3. The corporation that made your gadget tells the Copyright Office that you're a whiny baby who should just shut up and take it;

4. You write back to the Copyright Office to defend your use;

5. Months later, the Library of Congress gives you a limited permission to use your property (maybe);

And then...

6. You get a degree in computer science, and subject your gadget to close scrutiny to find a flaw in the manufacturer's programming;

7. Without using code or technical information from anyone else (including other owners of the same gadget) you figure out how to exploit that flaw to let you use your device in the way the government just said you could;

8. Three years later, you do it again.

Now, in practice, that's not how it works. In practice, people who want to use their own property in ways that the Copyright Office approves of just go digging around on offshore websites, looking for software that lets them make that use. (For example, farmers download alternative software for their John Deere tractors from websites they think might be maintained by Ukrainian hackers, though no one is really sure). If that software bricks their device, or steals their personal information, they have no remedy, no warranty, and no one to sue for cheating them.

That's the best case.

But often, the Library of Congress makes it even harder to make the uses they're approving. In 2015, they granted car owners permission to jailbreak their cars in order to repair them—but they didn't give mechanics the right to jailbreak the cars they were fixing. That ruling means that you, personally, can fix your car, provided that 1) you know how to fix a car; and 2) you can personally jailbreak the manufacturer's car firmware (in addition to abiding by the other snares in the final exemption language).

In other cases, the Copyright Office limits the term of the exemption as well as the scope: in the 2015 ruling, the Copyright Office gave security researchers the right to jailbreak systems to find out whether they were secure enough to be trusted, but not industrial systems (whose security is very important and certainly needs to be independently verified by those systems' owners!) and they also delayed the exemption's start for a full year, meaning that security researchers would only get two years to do their jobs before they'd have to go back to the Copyright Office and start all over again.

This is absurd.

Congress crafted the exemptions process to create an escape valve on the powerful tool it was giving to manufacturers with DMCA 1201. But even computer scientists don't hand-whittle their own software tools for every activity: like everyone else, they rely on specialized toolsmiths who make software and hardware that is tested, warranted, and maintained by dedicated groups, companies and individuals. The idea that every device in your home will have software that limits your use, and you can only get those uses back by first begging an administrative agency and then gnawing the necessary implement to make that use out of the lumber of your personal computing environment is purely absurd.

The Copyright Office is in the middle of a new rulemaking, and we've sent in requests for several important exemptions, but we're not kidding ourselves here: as important as it is to get the US government to officially acknowledge that DMCA 1201 locks up legitimate activities, and to protect end users, without the right to avail yourself of tools, the exemptions don't solve the whole problem.

That's why we're suing the US government to invalidate DMCA 1201. DMCA 1201 wasn't fit for purpose in 1998, and it has shown its age and contradictions more with each passing year.
