by Leigh Beadon on (#3K1C9)
Five Years Ago

This week in 2013, EA/Maxis was dealing with the fallout from its disastrous SimCity launch, which was ruined by always-online DRM (which, it turns out, was also disastrously hackable), by offering up tone-deaf responses while giving away earlier versions of the game as a weak apology. They were drawing ire from other developers, and then things got worse as a security hole was discovered in EA's Origin platform itself. Meanwhile, we were digging into copyright boss Maria Pallante's call for comprehensive, forward-thinking copyright reform, which included some good ideas, like not treating personal downloading as piracy, but was still largely focused on bad ones.

Ten Years Ago

This week in 2008, the makers of e-voting machines were doing everything they could to avoid scrutiny: while machines in Ohio were declared a crime scene, Sequoia was trying to keep Ed Felten away from reviewing its machines, and succeeded in scaring officials into backing down — all while a new study showed a massive error rate in e-voting. This was also the week that the world lost Arthur C. Clarke.

Fifteen Years Ago

It was this week in 2003 that the US invaded Iraq. Though the war didn't dominate our writing on Techdirt, we did take a look at the businesses rapidly moving to explore whether it would help or hurt them, and at the discussion around how this was the first true war of the internet era and what that meant for journalists. And it didn't take long for "war" to oust "sex" and "Britney Spears" as the top internet search.

Also this week: the RIAA moved into the suing-companies phase of its anti-file-sharing crusade; a Texas congressman wanted to throw college students in jail for file sharing, though surveys of students showed they had a much more modern understanding of the issues at stake; and MIT's Technology Review continued sounding the warning bells about America becoming a surveillance nation.
Techdirt
Link: https://www.techdirt.com/
Feed: https://www.techdirt.com/techdirt_rss.xml
Updated: 2025-08-25 20:03
by Tim Cushing on (#3K01Q)
Predictive policing software -- developed by Palantir and deployed secretly by the New Orleans Police Department for nearly six years -- is at the center of a criminal prosecution. The Verge first reported the NOPD's secret use of Palantir's software a few weeks ago, something only the department and the mayor knew anything about.
by Karl Bode on (#3JZN6)
A few years back, you might recall that there was a period of immense government and media hyperventilation over allegations that Chinese hardware vendor Huawei spied on American consumers. Story after story engaged in hysterical hand-wringing over this threat, most of them ignoring that Chinese gear and components are everywhere, including in American products. So the government conducted an 18-month investigation into those allegations and found no evidence whatsoever to support the claim that Huawei spies on Americans via its products. One anonymous insider put it this way at the time:
by Cathy Gellis on (#3JZCX)
Last week the Tenth Circuit refused to let New Mexico's anti-SLAPP statute be used in federal court in diversity cases. The relatively good news about the decision is that it is premised heavily on the specific language of New Mexico's statute and may not be easily extensible to other states' anti-SLAPP laws. This focus on the specific language is also why, as the decision acknowledges, it is inconsistent with holdings in other circuits, such as the Ninth. But the bad news is that the decision still takes the teeth out of New Mexico's statute and will invite those who would abuse judicial process in order to chill speech to bring actions that can get into the New Mexico federal courts.

In this case, there had been litigation pending in New Mexico state court. That litigation was then removed to federal court on the basis of "diversity jurisdiction." Diversity jurisdiction arises when the parties in the litigation are from separate states, the amount in controversy is more than $75,000, and the issue in dispute is solely a question of state law. Federal courts ordinarily can't hear cases that only involve state law, but because of the concern that it could be unfair for an out-of-state litigant to have to be heard in a foreign state court, diversity jurisdiction can allow a case that would have been heard in state court to be heard by the federal court for the area instead.

At the same time, we don't want it to be unfair for the other party to now have to litigate in federal court if being there means it would lose some of the protection of local state law. We also don't want litigants to be too eager to get into federal court if being there could confer an advantage they would not have had if the case were instead being heard in state court. These two policy goals underpin what is commonly known as the "Erie doctrine," named after a 1938 US Supreme Court case that is still followed today.

The Erie doctrine is why a case removed to federal court will still use state law to decide the matter. But sometimes it's hard to figure out how much state law needs to be used. Federal courts have their own procedural rules, for instance, and so they are not likely to use state procedural rules to govern their proceedings. They will only use substantive state law. But it turns out that figuring out whether a given law is procedural or substantive is anything but straightforward, and that is the question at the heart of this Tenth Circuit case: was New Mexico's anti-SLAPP law procedural, in which case the federal court did not have to follow it, or substantive, in which case it did? And unfortunately in this case, Los Lobos Renewable Power LLP v. Americulture, Inc., the Tenth Circuit decided it was "hardly a challenging endeavor" to conclude that the law was only procedural.

It based a significant portion of its decision on language unique to the New Mexico statute that differed from other states' and emphasized its procedural operation:
by Mike Masnick on (#3JZ4T)
As we've been discussing all week, a lot of people are reacting to the wrong thing in the whole Facebook / Cambridge Analytica mess. The problem was not that Facebook had an open API -- but that its users were unaware of what was happening with their own data. Unfortunately, many, many people (including the press and politicians) are running with the narrative that Facebook failed to "protect" data. And, just as we warned, the coming "solutions" won't help matters, but will actually make them worse.

Case in point: when Mark Zuckerberg finally made his big press tour on Wednesday evening, he repeatedly told people that the public has spoken and Facebook will lock down your data now.
by Tim Cushing on (#3JYZF)
The security researcher who was at the center of an audacious and disturbing government demand to unmask several Twitter accounts on the basis of an apparently menacing smiley emoji contained in one of them is now facing zero prison time for his supposed harassment of an FBI agent. Justin Shafer, who was originally facing five felony charges, has agreed to plead guilty to a single misdemeanor charge. Shafer, who spent eight months in jail for blogging about the FBI repeatedly raiding his residence, is finally going home.

Here are the details of the plea agreement [PDF] Shafer has agreed to. (h/t DissentDoe)
by Daily Deal on (#3JYWQ)
The Computer Hacker Professional Certification Bundle has 60+ hours of prep for the CISM, CISA, and CISSP certification exams to help you train to be an ethical hacker. In this bundle, you'll master the skills of hacking and penetration in order to learn how to defeat malicious hackers. You will learn about the role of social engineering in stealing confidential information, how to apply integrity controls and different types of encryption, and how to carry out an investigation according to industry best practices and legal guidelines. Three courses cover the basics of what you'll need to know to sit the various certification exams. The bundle is on sale for $49.

Note: The Techdirt Deals Store is powered and curated by StackCommerce. A portion of all sales from Techdirt Deals helps support Techdirt. The products featured do not reflect endorsements by our editorial team.
by Mike Masnick on (#3JYPP)
It's not like people didn't warn about this. But, following Congress passing SESTA (likely to be signed soon by the President), a bunch of sites are already starting to make changes. Craigslist is probably the most notable, announcing that it was completely shutting down its personals section:
by Karl Bode on (#3JY6X)
We've noted for some time how the broadband industry fights tooth and nail against more accurate broadband availability mapping, since a better understanding of the broadband industry's competition problem might just result in somebody actually doing something about it. This dysfunction and apathy was most recently illustrated by the FCC's recent release of an "updated" broadband availability map, which all but hallucinates competition, speeds, and overall availability. This map (available here) also omits pricing data at industry behest, resulting in a $300 million pair of rose-colored glasses.

But it's not just the FCC's broadband availability map that's under fire. FCC maps that determine which areas get wireless subsidies (more specifically, Mobility Fund Phase II (MF II) funding) are also a bad joke, for many of the same reasons. As such, a group of Senators from both parties fired off a letter to the FCC last week, politely pointing out how the FCC's new wireless coverage map dramatically overstates the availability of wireless broadband service:
by Tim Cushing on (#3JXV5)
As Spain continues to expand its (anti-)speech laws, the rights of its citizens continue to contract. Not content with making it illegal to insult a cop or government official, the Spanish government has decided to tackle hate speech and terrorism with the same ineptitude.

There's no punchline here. People are being arrested and charged over speech having nothing to do with promoting hate or terrorism. And this is in addition to the people who've found themselves targeted by vindictive public servants for daring to publicly criticize their words or actions.

It's gotten so bad that Amnesty International -- an entity that usually spends its time decrying the acts of dictators and brutal authoritarians -- has felt compelled to speak up about Spain's terrible speech laws. Mathew Ingram has more details at the Columbia Journalism Review.
by Timothy Geigner on (#3JX6Q)
Missing from far too many of the stories we post on trademark bullies is anything amounting to blowback. While it happens on occasion, the reason trademark bullying works is the cost of mounting any sort of defense, never mind the cost that would be required to actually go on the offense against a bully. Still, that isn't to say that when a trademark bully picks a fight, it can't sometimes backfire.

That appears to be the risk Chicago's famous Billy Goat Tavern now faces after it sued Billy Goat Chip Co., given the countersuit and factual response made by the chip company. Billy Goat Tavern filed suit in 2017, alleging that the St. Louis potato chip maker was infringing on its trademark with its name and logo, which uses the silhouette of a rearing billy goat. For what it's worth, the tavern's logo is completely different and features a fully detailed cartoon head of a goat, not a black outline like the chip company's.

But based on the information in the countersuit, it seems there is much more factual information the tavern ought to have considered before filing its initial lawsuit.
by Timothy Geigner on (#3JWST)
In the wake of a Tempe, Arizona woman being struck and killed by an Uber autonomous vehicle, there has been a flurry of information coming out about the incident. Despite that death being one of eleven in the Phoenix area alone, and the only one involving an AV, the headlines were far closer to the "Killer Car Kills Woman" sort than they should have been. Shortly after the crash, the Tempe Police Chief went on the record suggesting that the victim had at least some culpability in the incident, having walked outside of the designated crosswalk, and that the entire thing would have been difficult for either a human or an AI to avoid.

Strangely, now that the video from Uber's onboard cameras has been released, the Tempe police are trying to walk that back and suggest that reports of the Police Chief's comments were taken out of context. That is likely the result of the video footage showing that claims the victim "darted out" in front of the car are completely incorrect.
by Mike Masnick on (#3JWKX)
We were concerned, last month, by the appeals court ruling in the Cox v. BMG case regarding the DMCA's repeat infringer policy rules, though the more I've reread that ruling, the less bothered by it I've become. While I'm still concerned about how bad decisions by Cox created potentially bad law, there are enough specifics in the ruling that will hopefully limit its impact to specific circumstances. In particular, whereas Cox was found not to have implemented a "reasonable" termination policy for repeat infringers, the court does acknowledge that under the law platforms have wide leeway in determining what their termination policy should be. The real problem for Cox was that it appeared not to actually follow its own policy, and thus did not reasonably implement it.

That was over in the 4th Circuit. Last week, the 9th Circuit ruled on a case that also raised questions about a repeat infringer policy, and the ruling is a clean one in defense of platforms determining their own rules for terminating repeat infringers. The case, Ventura Content v. Motherless, involves a porn producer suing a site that allowed user uploads of porn. From the description in the case, Motherless qualifies for the DMCA's safe harbors as a site where the content is submitted by users, and the ruling goes into great detail about the steps that Motherless's sole employee, Joshua Lange, goes through to review content uploaded to the site to make sure it doesn't violate the site's terms (which mostly seem aimed at blocking child porn). Motherless also appears to follow a pretty standard DMCA takedown process. Actually, the site appears to go beyond what is legally required, accepting notices that don't even meet the DMCA notice standard and removing much of the notified content.

While the site did not have a written-out "repeat infringer policy," Lange did have some mental metrics he used in reviewing accounts, and did shut off ones that were receiving lots of copyright takedown notices.
by Tim Cushing on (#3JWAB)
Just as the Supreme Court is considering the legality of extraterritorial demands for communications held by US internet service providers in overseas data storage, Congress is doing all it can to short-circuit the debate. Tucked away towards the back of a 2,200-page spending bill is something called the "Clarifying Lawful Overseas Use of Data Act" or (of course) "CLOUD Act." (h/t Steve Vladeck)

The CLOUD Act [PDF - starting at p. 2201] would make any decision by the Supreme Court moot. If the Court agrees with Microsoft -- as lower courts have -- that the US has no right to demand communications stored overseas with a normal warrant, the Act would immediately overturn the decision. If it decides against Microsoft, it will be aligned with the new law. As it stands now, the route most likely to be taken by the Supreme Court is a punt. Legislation on point is in play, and the Court will probably be more than happy to let legislators make the final call.

Beyond the obvious problem of giving US law enforcement permission to use regular warrants to bypass mutual assistance treaties, the law also allows for reciprocation. We can't go around waving SCA (Stored Communications Act) warrants in foreign lands without expecting pushback from locals. So we'll have to give foreign countries the same privileges, even if the criminal charges being investigated wouldn't be considered criminal acts in this country, and even if the country enjoying this reciprocation doesn't care much about its own citizens' rights and privacy.

The EFF is especially critical of the shoehorned-in CLOUD Act. As it points out, the law would result in backdoor searches of anyone's communications via reciprocal communication demands. In the US, we've already seen the Fourth Amendment circumvented by US government agencies via their access to NSA collections. The same would happen in reverse when other countries start playing by the CLOUD Act's new rules.
by Cathy Gellis on (#3JW7A)
Hold on tight to those memories of all the good things the Internet has brought. SESTA has just passed the Senate, and at this point there is a clear legislative path to undermining Section 230, the law that has enabled all those good things the Internet has offered.

It is not entirely Facebook's fault: opportunists from Hollywood saw it as a chance to weaken the innovation that weakens their antiquated grip over people's creativity. Ill-informed celebrities, who understood absolutely nothing about the cause they professed to advocate for, pressed their bumper-sticker demands that something be done, even though that something is destructive to the very cause the bumper stickers were for. Willfully ignorant members of Congress then bought into the bumper-sticker rhetoric, despite all the evidence they had about how destructive this law would be to those interests and to online speech generally.

Even frequent innovation ally Senator Wyden joined the chorus mounting against the tech industry, lending credence to the idea that when it came to a law that would undermine the Internet, the Internet had it coming.
by Daily Deal on (#3JW7B)
Keep your skills sharp and stay up to date on new developments with the $89 Virtual Training Company Unlimited Single User Subscription. With courses covering everything from MCSE certification training to animation, graphic design, and page layout, you'll have unlimited access to the entire catalog. They have over 1,000 courses, add more each week, and each course comes with a certificate of completion.

Note: The Techdirt Deals Store is powered and curated by StackCommerce. A portion of all sales from Techdirt Deals helps support Techdirt. The products featured do not reflect endorsements by our editorial team.
by Tim Cushing on (#3JVYC)
Another Alabama sheriff has been caught abusing a law that's inexplicably still on the books. Over the course of three years, Etowah County Sheriff Todd Entrekin took home at least $750,000 in funds meant to be used to feed inmates in his jail. Thanks to another bad law, there's no telling how much more than $750,000 Entrekin has pocketed, but he certainly seems to have a lot of disposable income. (h/t Guy Hamilton-Smith)
by Karl Bode on (#3JVAQ)
Last year, you might recall, the GOP and Trump administration rushed to kill not only net neutrality at ISP lobbyist behest, but also some pretty basic but important consumer privacy rules. The protections, which would have taken effect in March of 2017, simply required that ISPs be transparent about what personal data is collected and sold, while mandating that ISPs provide consumers with the ability to opt out of said collection. But because informed and empowered consumers dampen ad revenues, ISPs moved quickly to have the rules scuttled with the help of cash-compromised lawmakers.

When California lawmakers then stepped in to try to pass their own copy of those rules, ISPs worked in concert with Google and Facebook to scuttle those rules as well. As the EFF documented at the time, Google, Facebook, AT&T, Verizon, and Comcast all collectively and repeatedly lied to state lawmakers, claiming the planned rules would harm children, increase internet popups, and somehow "embolden extremism" on the internet. The misleading lobbying effort was successful, and the proposal was quietly scuttled without much fanfare in the tech press.

Obviously this behavior has some broader implications in the wake of the Cambridge Analytica scandal, especially given Facebook's insistence this week that it's open to being regulated on privacy, and is "outraged" by "deception," as it tries (poorly) to mount a sensible PR response to the entire kerfuffle:
by Mike Masnick on (#3JTWW)
It took way too long, but Mark Zuckerberg finally responded to the Cambridge Analytica mess on Wednesday afternoon. As we've discussed in a series of posts all week, this is a complex issue, where a lot of people are getting the details wrong, and thus most of the suggestions in response are likely to make the problem worse.

To be clear, Mark's statement on the issue is not bad. It's obviously been workshopped through a zillion high-priced PR people, and it avoids all the usual "I'm sorry if we upset you..." kind of tropes. Instead, it's direct, it takes responsibility, it admits error, does very little to try to "justify" what happened, and lists out concrete steps that the company is taking in response to the mess.
by Mike Masnick on (#3JT3W)
This was not unexpected, but earlier today the Senate easily passed SESTA/FOSTA (the same version the House passed a few weeks ago) by a 97 to 2 vote -- with only Senators Ron Wyden and Rand Paul voting against it. We've explained in great detail why the bill is bad. We've explained in great detail why the bill won't stop sex trafficking and will actually put sex workers' lives in more danger, while also stomping on free speech and the open internet at the same time (which some see as a feature rather than a bug). The Senate declined to put any fixes in place.

Senator Wyden, who had originally offered up an amendment that would have fixed at least one big problem with the bill (clarifying that doing any moderation doesn't subject you to liability for other types of content), pulled the amendment right before the vote, noting that there had been a significant, if dishonest, lobbying effort to kill those amendments, meaning it had no chance. He did note that, because of the many problems with the bill, he fully expects these issues to be revisited shortly.

As for the many problems of the bill... well, they are legion, starting with the fact that multiple parts of the bill appear to be unconstitutional. That's most obvious in the "ex post facto" clause that applies the new criminal laws to activities in the past, which is just blatantly unconstitutional. There are serious questions about other parts of the bill as well, including concerns that it violates the First Amendment. It seems likely that the law will be challenged in court soon enough.

In the meantime, though, the damage here is real. The clearest delineation of the outright harm this bill will cause can be seen in a Twitter thread from a lawyer who represents victims of sex trafficking, who tweeted last night about just how much damage this will do. It's a long Twitter thread, but well worth reading. Among other things, she notes that sites like Backpage were actually really useful for finding victims of sex trafficking and for helping them get out of dangerous situations. She talks about how her own clients would disappear, and the only way she could get back in touch with them to help was often through these platforms. All of that will be gone, meaning that more people will be in danger and it will be that much harder for advocates and law enforcement to help them. She similarly notes that many of the groups supporting SESTA "haven't gotten their hands dirty in the field" and don't really understand what's happening.

That's true on the internet side as well. Mike Godwin highlights the history before CDA 230 was law and the kinds of problems that come about when you make platforms liable for the speech of their users.
by Mike Masnick on (#3JSTZ)
Last week, we wrote a bit about Donald Trump's nominee to head the CIA, Gina Haspel. That post highlighted a bunch of reporting about Haspel's role in running a CIA black site in Thailand that was a key spot in the CIA's torture program. Soon after we published it, ProPublica retracted and corrected an earlier piece -- one on which much of the reporting about Haspel's connection to torture relied. Apparently, ProPublica was wrong about the date at which Haspel started at the site, meaning that she took over soon after the most famous torture victim, Abu Zubaydah, was no longer being tortured. Thus, earlier claims that she oversaw his inhumane, brutal, and war-crimes-violating torture were incorrect. To some, this error has been used to toss out all of the concerns and complaints about Haspel, even though reporters now agree that she did oversee the torture of at least one other prisoner at a time when other CIA employees were seeking to transfer out of the site out of disgust for what the CIA was doing.

However, what this incident should do is make it clear that the Senate should not move forward with Haspel's nomination unless the details of her involvement are declassified. As Trevor Timm notes, ProPublica's error was not due to problematic reporting, but was the inevitable result of the CIA hiding important information from the public.
by Tim Cushing on (#3JSJB)
Here's an idea for the FBI, gift-wrapped and signed "From Russia, With Love."
by Mike Masnick on (#3JS41)
In my last post, I described why it was wrong to focus on claims of Facebook "selling" your data as the "problem" that came out over the weekend concerning Cambridge Analytica and the data it had on 50 million Facebook users. As we described in detail in that post, that's not the problem at all. Instead, much of the problem has to do with Facebook's utter failure to be transparent in a way that matters -- specifically, in a way that lets its users actually understand what's happening (or what may happen) to their data. Facebook would likely respond that it has tried to make that information clear (or, alternatively, it may say that it can't force users to understand what they don't take the time to understand). But I don't think that's a good answer. As we've learned, there's a lot more at stake here than I think even Facebook recognized, and providing much more real transparency (rather than superficial transparency) is what's necessary.

But that's not what most people are suggesting. For example, a bunch of people are calling for "Know Your Customer" type regulations similar to what's found in the financial space. Others seem to just be blindly demanding "oversight" without being able to clearly articulate what that even means. And some are bizarrely advocating "nationalizing Facebook," which would literally mean giving billions in taxpayer dollars to Mark Zuckerberg. But these "solutions" won't solve the actual issues. In that article about "KYC" rules, there's the following, for example:
by Daily Deal on (#3JS1N)
The Project Management Professional Certification Training Bundle features 10 courses designed to get you up and running as a project manager. You'll prepare for certification exams by learning the fundamental knowledge, terminology, and processes of effective project management. Various project management methods are covered as well, including Six Sigma, Risk Management, PRINCE2, and more. The bundle is on sale for $49.

Note: The Techdirt Deals Store is powered and curated by StackCommerce. A portion of all sales from Techdirt Deals helps support Techdirt. The products featured do not reflect endorsements by our editorial team.
by Mike Masnick on (#3JRTG)
Obviously, over the past few days there's been plenty of talk about the big mess concerning Cambridge Analytica using data on 50 million Facebook users. And with that talk has come all sorts of hot takes and ideas and demands -- not all of which make sense. Indeed, it appears that there's such a rush to condemn bad behavior that many are not taking the time to figure out exactly what bad behavior is worth condemning. And that's a problem. Because if you don't understand the actual bad behavior, then your "solutions" will be misplaced. Indeed, they could make problems worse. And... because I know that some are going to read this post as a defense of Facebook, let me be clear (as the title of this post notes): Facebook has many problems, and has done a lot of bad things (some of which we'll discuss below). But if you mischaracterize those "bad" things, then your "solutions" will not actually solve them.

One theme that I've seen over and over again in discussions about what happened with Facebook and Cambridge Analytica is the idea that Facebook "sold" the data it had on users to Cambridge Analytica (or, alternatively, that Cambridge Analytica "stole" that data). Neither is accurate, and I'm somewhat surprised to see people who are normally careful about these things -- such as Edward Snowden -- harping on the "selling data" concept. What Facebook actually does is sell access to individuals based on their data and, as part of that, open up the possibility for users to give some data to companies, often unwittingly. There's a lot of nuance in that sentence, and many will argue that for all reasonable purposes "selling data" and my much longer version are the same thing. But they are not.

So before we dig into why they're so different, let's point out one thing that Facebook deserves to be yelled at over: it does not make this clear to users in any reasonable way.
Now, perhaps that's because it's not easy to make this point, but, really, Facebook could at least do a better job of explaining how all of this works. Now, let's dig in a bit on why this is not selling data. And for that, we need to talk about three separate entities on Facebook. First are advertisers. Second are app developers. Third are users.

The users (all of us) supply a bunch of data to Facebook. Facebook, over the years, has done a piss-poor job of explaining to users what data it actually keeps and what it does with that data. Despite some pretty horrendous practices on this front early on, the company has tried to improve greatly over the years. And, in some sense, it has succeeded -- in that users have a lot more granular control and ability to dig into what Facebook is doing with their data. But it does take a fair bit of digging, and it's not that easy to understand -- or to understand the consequences of blocking some aspects of it.

The advertisers don't (as is all too commonly believed) "buy" data from Facebook. Instead, they buy the ability to put ads into the feeds of users who match certain profiles. Again, some will argue this is the same thing. It is not. From merely buying ads, the advertiser gets no data in return about the users. It just knows what sort of profile info it asked for the ads to appear against, and it knows some very, very basic info about how many people saw or interacted with the ads. Now, if the ad includes some sort of call to action, the advertiser might then gain some information directly from the user, but that's still at the user's choice.

The app developer ecosystem is a bit more complicated. Back in April of 2010, Facebook introduced the Open Graph API, which allowed app developers to hook into the data that users were giving to Facebook. Here's where "things look different in retrospect" comes into play. The original Graph API allowed developers to access a ton of information.
In retrospect, many will argue that this created a privacy nightmare (which, it kinda did!), but at the same time, it also allowed lots of others to build interesting apps and services, leveraging the data that users themselves were sharing (though not always realizing they were sharing it). It was actually a move towards openness, in a manner that many considered benefited the open web by allowing other services to build on top of the Facebook social graph.

There is one aspect of the original API that does still seem problematic -- and really should have been obviously problematic right from the beginning. And this is another thing that it's entirely appropriate to slam Facebook for not comprehending at the time. As part of the API, developers could not only get access to all this information about you... but also about your friends. Like... everything. From the original Facebook page, you can see all the "friend permissions" that were available. These are better summarized in the following chart from a recent paper analyzing the "collateral damage of Facebook apps."

If you can't read that... it's basically a ton of info from friends, including their likes, birthdays, activities, religion, status updates, interests, etc. You can kind of understand how Facebook ended up thinking this was a good idea. If an app developer was designing an app to provide you a better Facebook experience, it might be nice for that app to have access to all that information so it could display it to you as if you were using Facebook. But (1) that's not how this ever worked (and, indeed, Facebook went legal against services that tried to provide a better Facebook experience) and (2) none of this was made clear to end users -- especially the idea that in sharing your data with your friends, they might cough up literally all of it to some shady dude pushing a silly "personality test" game.

But, of course, as I noted in my original post, in some cases, this setup was celebrated.
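To make those friend permissions concrete: under the v1.0-era model, a third-party app simply listed friend-data scopes in the login dialog it sent users to, and one user's click granted access to their friends' information. Here's a minimal, hypothetical sketch of how an app assembled such a request -- the helper function and app values are invented for illustration, and the scope names follow the old friend-permission naming:

```python
from urllib.parse import urlencode

# Friend-data permission scopes from the (now retired) Graph API v1.0 era.
# An app listed these in its OAuth dialog request; the one user who clicked
# "accept" granted access to their *friends'* data as well.
FRIEND_SCOPES = [
    "friends_likes",
    "friends_birthday",
    "friends_religion_politics",
    "friends_status",
    "friends_interests",
]

def build_oauth_dialog_url(app_id, redirect_uri, scopes):
    """Assemble the login-dialog URL an app sent users to (illustrative)."""
    params = {
        "client_id": app_id,
        "redirect_uri": redirect_uri,
        "scope": ",".join(scopes),
        "response_type": "code",
    }
    return "https://www.facebook.com/dialog/oauth?" + urlencode(params)

url = build_oauth_dialog_url("123456", "https://quiz.example/callback", FRIEND_SCOPES)
print(url)
```

Note that nothing in this flow involved the friends themselves: the consent screen was shown only to the installing user, which is exactly the transparency gap discussed above.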
When the Obama campaign used the app API this way to reach more and more people and collect all the same basic data, it was celebrated as being a clever "voter outreach" strategy. Of course, the transparency levels were different there. Users of the Obama app knew what they were supporting -- though perhaps didn't realize they were revealing a lot of friend data at the same time. Users of Cambridge Analytica's app... just thought they were taking a personality quiz.

And that brings us to the final point here: Cambridge Analytica, like many others, used this setup to suck up a ton of data, much of it from friends of people who agreed to install a personality test app (and a bunch of those users were actually paid via Mechanical Turk to basically cough up all their friends' data). There are reasonable questions about why Facebook set up its API this way (though, as noted above, there were defensible, if short-sighted, reasons). There are reasonable questions about why Facebook wasn't more careful about watching what apps were doing with the data they had access to. And, most importantly, there are reasonable questions about how transparent Facebook was to its end users through all of this (hint: it was not at all transparent).

So there are plenty of things that Facebook clearly did wrong, but it wasn't about selling data to Cambridge Analytica, and it wasn't Cambridge Analytica "stealing" data. The real problem was in how all of this was hidden. It comes back to transparency. Facebook could argue that this information was all "public" -- which, uh, okay, it was, but it was not public in a way that the average Facebook user (or even most "expert" Facebook users) truly understood.
So if we're going to bash Facebook here, it should be for the fact that none of this was clear to users. Indeed, even though Facebook shut down this API in April of 2015 (after deprecating it in April of 2014), most users still had no idea just how much information Facebook apps had about them and their friends. Today, the new API still coughs up a lot more info than people realize about themselves (and, again, that's bad and Facebook should improve that), but no longer your friends' data as well.

So slam Facebook all you want for failing to make this clear. Slam Facebook for not warning users about the data they were sharing -- or that their friends could share. Slam Facebook for not recognizing how apps were sucking up this data and the privacy implications related to that. But don't slam Facebook for "selling your data" to advertisers, because that's not what happened.

I was going to use this post to also discuss why this misconception is leading to bad policy prescriptions, but this one is long enough already, so stay tuned for that one next. Update: And here's that post.
|
![]() |
by Karl Bode on (#3JR9C)
To be very clear, Facebook is well deserving of the mammoth backlash the company is experiencing in the wake of the Cambridge Analytica revelations. Especially since Facebook's most substantive reaction to date has been to threaten lawsuits against news outlets for telling the truth. And, like most of these stories, it's guaranteed that the core story is only destined to get worse as more and more is revealed about the way such casual handling of private consumer data is pretty much routine, not only at Facebook, but everywhere.

Despite the fact that consumer privacy apathy is now bone-grafted to the DNA of global corporate culture (usually only bubbling up after a scandal breaks), the outrage over Facebook's lack of transparency has been monumental.

Verizon-owned Techcrunch, for example, this week went so far as to call Facebook a "cancer," demanding that readers worried about privacy abuses delete their Facebook accounts. The #Deletefacebook hashtag has been trending, and countless news outlets have subsequently provided wall-to-wall coverage on how to delete your Facebook account (or at least delete older Facebook posts and shore up app-sharing permissions) in order to protect your privacy.

And while this outrage is well-intentioned and certainly justified, a lot of it seems a touch naive. Many of the folks that are busy deleting their Facebook accounts are simultaneously still perfectly happy to use their stock smartphone on a major carrier network, seemingly oblivious to the ugly reality that the telecom sector has been engaged, routinely, in far worse privacy violations for the better part of the last two decades. Behavior that has just as routinely failed to see anywhere near the same level of outrage from consumers, analysts or the tech press.

You'll recall that a decade ago, ISPs were caught routinely hoovering up clickstream data (data on each and every website you visit), then selling it to whoever was willing to pony up the cash.
When ISPs were asked to share more detail on this data collection by the few outlets that thought this might not be a good idea, ISP executives would routinely play dumb and mute (they still do). And collectively, the lion's share of the press and public generally seemed OK with that.

From there, we learned that AT&T and Verizon were effectively bone-grafted to the nation's intelligence apparatus, and both companies were caught routinely helping Uncle Sam not only spy on Americans without warrants, but providing advice on how best to tap dance around wiretap and privacy laws. When they were caught spying on Americans in violation of the law, these companies' lobbyists simply convinced the government to change the law to make this behavior retroactively legal. Again, I can remember a lot of tech news outlets justifying this apathy for national security reasons.

Once these giant telecom operators were fused to the government's data gathering operations, holding trusted surveillance partners accountable for privacy abuses (or much of anything else) increasingly became an afterthought. Even as technologies like deep packet inspection made it possible to track and sell consumer online behavior down to the millisecond. As the government routinely signaled that privacy abuses wouldn't be seriously policed, large ISPs quickly became more emboldened when it came to even more "creative" privacy abuses.

Like the time Verizon Wireless was caught covertly modifying user data packets to track users around the internet without telling them or letting them opt out. It took two years for security researchers to even realize what Verizon was doing, and another six months for Verizon to integrate an opt-out function.
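Mechanically, that kind of tracking worked because the carrier sits in the middle of every unencrypted connection and can append an identifying header (Verizon's was reported as X-UIDH) to subscribers' outbound HTTP requests. Here's a toy sketch of the idea; the token derivation and function names are invented for illustration:

```python
import hashlib

def carrier_inject_uidh(request_headers, subscriber_id, salt="carrier-secret"):
    """Model of a carrier middlebox appending a persistent tracking header
    (Verizon's was reported as X-UIDH) to plaintext HTTP requests.
    The token derivation here is invented for illustration."""
    token = hashlib.sha256((salt + subscriber_id).encode()).hexdigest()[:24]
    injected = dict(request_headers)
    injected["X-UIDH"] = token
    return injected

# The same subscriber gets the same token on every request to every site,
# which is what let ad networks build cross-site profiles.
a = carrier_inject_uidh({"Host": "news.example"}, "subscriber-42")
b = carrier_inject_uidh({"Host": "shop.example"}, "subscriber-42")
print(a["X-UIDH"] == b["X-UIDH"])  # same identifier across sites
```

Because the injection happens inside the network rather than on the device, no browser setting or cookie purge could remove it; only end-to-end encryption (HTTPS) keeps a middlebox from rewriting headers this way.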
But despite a wrist slap by the FCC, the company continues to use a more powerful variant of the same technology across its "Oath" ad empire (the combination of AOL and Yahoo) without so much as a second glance from most news outlets.

Or the time that AT&T, with full regulatory approval, decided it would be cool to charge its broadband customers hundreds of additional dollars per year just to protect their own privacy, something the company had the stones to insist was somehow a "discount." Comcast has since explored doing the same thing in regulatory filings (pdf), indicating that giant telecom monopolies are really keen on making consumer privacy a luxury option. Other companies, like CableOne, have crowed about using credit data to justify providing low-income customers even worse support than the awful customer service the industry is known for.

And again, this was considered perfectly OK by government regulators, and (with a few exceptions) most of these efforts barely made a dent in national tech coverage. Certainly nowhere near the backlash we've seen from this Facebook story.

A few years back, the Wheeler-run FCC realized that giant broadband providers were most assuredly running off the rails in terms of consumer privacy, so it proposed some pretty basic privacy guidelines for ISPs. While ISPs whined incessantly about the "draconian" nature of the rules, the reality is they were relatively modest: requiring that ISPs simply be transparent about what consumer data was being collected or sold, and provide consumers with working opt-out tools.

But the GOP and Trump administration quickly moved (at Comcast, Verizon and AT&T's lobbying behest) to gut those rules via the Congressional Review Act before they could take effect.
And when states like California tried to pass some equally modest privacy guidelines for ISPs at the state level to fill the void, telecom duopolies worked hand in hand with Google and Facebook to kill the effort, falsely informing lawmakers that privacy safeguards would harm children, inundate the internet with popups (what?), and somehow aid extremism on the internet. You probably didn't see much tech press coverage of this, either.

So again, it makes perfect sense to be angry with Facebook. But if you're deleting Facebook to protect your privacy while still happily using your stock, bloatware-laden smartphone on one of these networks, you're just trying to towel off in a rainstorm. The reality is that apathy toward consumer privacy issues is the norm across industries, not the exception, and however bad Facebook's behavior has been on the privacy front, the telecom industry has been decidedly worse for much, much longer. And whereas you can choose not to use Facebook, a lack of competition means you're stuck with your snoop-happy ISP.

We've collectively decided, repeatedly, that it's OK to sacrifice consumer privacy and control for fatter revenues, a concept perfected by the telecom sector and the Congressional and regulatory lackeys paid to love and protect it from accountability and competition. So while it's wonderful that we're suddenly interested in having a widespread, intelligent conversation about privacy in the wake of the Facebook revelations, let's do so with the broader awareness that Facebook's casual treatment of consumer privacy is just the outer maw of a mammoth gullet of dysfunction.
|
![]() |
by Tim Cushing on (#3JQWA)
Across the sea in the UK, offensive speech is still getting people jailed. An obnoxious person who trained his girlfriend's dog to perform the Nazi salute and respond excitedly to the phrase "gas the Jews" is looking at possible jail time after posting these exploits to YouTube under the name Count Dankula. According to Scotland resident Markus Meechan, it was the "least cute" thing he could train his girlfriend's dog to do, apparently in response to her constant gushing about the dog's cuteness.

Meechan's video racked up 3 million views on YouTube, but it really didn't start making news until local police started paying attention.
|
![]() |
by Timothy Geigner on (#3JQ73)
When it comes to content producers reacting to the pirating of their works, we've seen just about every reaction possible. From costly lawsuits and copyright trolling, to attempts to engage with this untapped market, up to and including creatively messing with those that would commit copyright infringement. The last of those options doesn't do a great deal to generate sales revenue, but it can often be seen by the public as both a funny way to jerk around pirates and as a method for educating them on the needs of creators.

But Fstoppers, a site that produces high-end tutorials for photographers and sells them for hundreds of dollars each, may have taken the creativity to the next level to mess with those downloading illegitimate copies of its latest work. It decided to release a version of Photographing the World 3 on several torrent sites a few days before it went to retail, but the version it released was much different from the actual product. It was close enough to the real thing that many people were left wondering just what the hell was going on, but ridiculous enough that it's downright funny.
|
![]() |
by Timothy Geigner on (#3JPX6)
The internet ink has barely dried on Karl's post about an Uber self-driving vehicle striking and killing a pedestrian in Arizona, and we already have an indication from the authorities that the vehicle probably isn't to blame for the fatality. Because public relations waits for nobody, Uber suspended its autonomous vehicles in the wake of the death of a woman in Tempe, but that didn't keep fairly breathless headlines from being painted all across the mainstream media. The stories that accompanied those headlines were more careful to mention that an investigation is required before anyone knows what actually happened, but the buzz created by the headlines wasn't so nuanced. I actually saw this in my own office, where several people could be heard mentioning that autonomous vehicles were now done.

But that was always silly. It's an awkward thing to say, but the fact that it took this long for AVs to strike and kill a pedestrian is a triumph of technology, given just how many people we humans kill with our cars. Hell, the Phoenix area itself had 11 pedestrian deaths by car in the last week, with only one of them being this Uber car incident. And now all of that hand-wringing is set to really look silly, as the Tempe police chief is indicating that no driver, human or AI, would likely have been able to prevent this death.
|
![]() |
by Leigh Beadon on (#3JPJM)
This isn't the first time we've discussed this on the podcast, and it probably won't be the last — disinformation online is a big and complicated topic, and there are a whole lot of angles to approach it from. This week, we're joined by Renee DiResta, who has been researching disinformation ever since the anti-vaxxer movement caught her attention, to discuss what exactly it means to say social media platforms should be held accountable.

Follow the Techdirt Podcast on Soundcloud, subscribe via iTunes or Google Play, or grab the RSS feed. You can also keep up with all the latest episodes right here on Techdirt.
|
![]() |
by Mike Masnick on (#3JPAW)
For the past few years, there's been a dedicated effort by some to get mandatory filters into EU copyright rules, despite the fact that this would destroy smaller websites, wouldn't work very well, and would create all sorts of other consequences the EU doesn't want, including suppression of free speech. Each time it pops up again, a few people who actually understand these things have to waste a ridiculous amount of time lobbying folks in Brussels to explain to them how disastrous the plan would be, and they back down. And then, magically, it comes back again.

That appeared to happen again last week. EU Parliament Member Julia Reda called attention to this by pointing out that, despite a promise that mandatory filters would be dropped, they had suddenly come back:
|
![]() |
by Tim Cushing on (#3JP4K)
For the first time since the FISA court opened for national security business, the DOJ is considering declassifying FISA warrant applications. The documents are linked to the FBI's surveillance of former Trump campaign aide Carter Page. Both sides of the political aisle have asked for these documents, which is something you'd think they'd have wanted to see before issuing their takes on perceived surveillance improprieties.

Devin Nunes -- following the release of his memo -- sent a letter to the FISA court asking it to clear the warrants for public release. The court's reply, penned by Judge Rosemary Collyer, pointed out two things. First, the FISA court had never released these documents publicly, nor was it in the best position to do so. It is only tasked with determining whether or not surveillance is warranted and to what restrictions it must adhere. It does not have the innate power to declassify documents, nor can it arbitrarily decide which documents have gathered enough public interest to outweigh the government's perpetual demands for secrecy.

The court did point out this release could be achieved much faster if Nunes directed his question to the DOJ, which does have the power to declassify its own investigation documents. It doesn't appear Devin Nunes has approached the DOJ, but litigants in a FOIA lawsuit have, and they're looking at possibly obtaining the documents Devin Nunes requested from the FISA court.
|
![]() |
by Daily Deal on (#3JP4M)
The 2018 Arduino Enthusiast E-Book Bundle contains 10 ebooks of project-based instruction to help you master all things Arduino. Pay what you want and get the Arduino Computer Vision Programming ebook, where you'll learn how to develop Arduino-supported computer vision systems that can interact with the real world by seeing it. Beat the average price ($10 at the time of writing) to unlock access to 9 more ebooks. You'll learn to create your own wearable projects by mastering different electronic components, such as LEDs and sensors. Discover how to build projects that can move using DC motors, walk using servo motors, and avoid barriers using sensors. From home automation to your own IoT projects and more, these books have you covered.

Note: The Techdirt Deals Store is powered and curated by StackCommerce. A portion of all sales from Techdirt Deals helps support Techdirt. The products featured do not reflect endorsements by our editorial team.
|
![]() |
by Mike Masnick on (#3JNVV)
Facebook -- and Sheryl Sandberg in particular -- have been the most vocal supporters of SESTA. Sandberg wrote a bizarre Facebook post supporting the horrible SESTA/FOSTA Frankenstein bill the day it was voted on in the House. In it, she wrote:
|
![]() |
by Karl Bode on (#3JN9N)
Cable providers like Comcast and Charter continue to quietly secure a growing monopoly over American broadband. A new report by Leichtman Research notes that the nation's biggest cable companies added a whopping 83% of all net broadband subscribers last quarter. All told, the nation's top cable companies (predominantly Charter Spectrum and Comcast) added 2.7 million broadband subscribers in 2017, while the nation's telcos (AT&T, Verizon, CenturyLink, Frontier) saw a net loss of 625,000 subscribers last year, slightly worse than the 600,000-subscriber net loss they witnessed in 2016.

A pretty obvious pattern begins to emerge from Leichtman's data, and it's one of total and absolute cable industry dominance:
|
![]() |
by Tim Cushing on (#3JMWX)
Police in Raleigh, North Carolina are using Google as a proxy surveillance dragnet. This likely isn't limited to Raleigh. Google harvests an astounding amount of data from users, but what seems to be of most interest to law enforcement is location info.
|
![]() |
by Glyn Moody on (#3JM7Q)
We've just written about calls for a key legal communications system to be open-sourced as a way of re-building confidence in a project that has been plagued by problems. In many ways, it's surprising that these moves aren't more common. Without transparency, there can be little trust that a system is working as claimed. In the past this was just about software, but today there's another aspect to the problem. As well as the code itself, there are the increasingly-complex algorithms, which the software implements. There is a growing realization that algorithms are ruling important parts of our lives without any public knowledge of how they work or make decisions about us. In Germany, for example, one of the most important algorithms determines a person's SCHUFA credit rating: the name comes from an abbreviation of its German "Schutzorganisation für Allgemeine Kreditsicherung", which means "Protection Agency for General Credit Security". As a site called Algorithm Watch explains:
|
![]() |
by Timothy Geigner on (#3JMJ2)
Violent video games have once again found themselves in the role of scapegoat after a recent spate of gun violence in America. After the Florida school shooting, and in the extended wake of the massacre in Las Vegas, several government representatives at various levels have leveled their ire at violent games, including Trump, who commissioned an insane sit-down in which he acted as moderator between game company executives and those that blame them for all the world's ills. Amid this deluge of distraction, it would be easy to forget that study after study after study has detailed how bunk the notion is that you can tie real-world violence to violent games. Not to mention, of course, that there have never been more people playing more violent video games in the history of the world than at this moment right now, and at the same time research shows a declining trend for deviant behavior in teens rather than any sort of upswing.

But a recent study conducted by the Max Planck Institute and published in Molecular Psychiatry further demonstrates the point that violence and games are not connected, with a specific methodology that carries a great deal of weight. The purpose of the study was to move beyond measuring behavioral effects immediately after short, unsustained bursts of game-playing and into the realm of the effects of sustained, regular consumption of violent video games.
|
![]() |
by Karl Bode on (#3JKMG)
Despite worries about the reliability and safety of self-driving vehicles, the millions of test miles driven so far have repeatedly shown self-driving cars to be significantly safer than their human-piloted counterparts. Yet whenever accidents (or near accidents) occur, they tend to be blown completely out of proportion by those terrified of (or financially disrupted by) an automated future.

So it will be interesting to watch the reaction to news that a self-driving Uber vehicle was, unfortunately, the first to be involved in a fatality over the weekend in Tempe, Arizona:
|
![]() |
by Tim Cushing on (#3JKAV)
The old truism is in play again with the FBI's renewed CryptoWar: if X is outlawed, only criminals will have X. In this case, it's secure encryption. The FBI may not be trying to get encryption banned, but it does want it weakened. No backdoors, claims FBI director Chris Wray, just holes for the government to use at its pleasure. So, if the FBI gets its way, the only truly secure encryption will be in the hands of criminals… exactly the sort of people the FBI claims it needs weakened encryption to catch.
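The structural problem with "holes for the government" can be shown with a toy model of key escrow: if every session key is also wrapped for an escrow authority, then whoever holds the escrow key -- agency, insider, or thief -- can read everything. The cipher below is a deliberately trivial XOR stand-in, not real cryptography; it only illustrates who can decrypt what:

```python
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream (repeating key). NOT real cryptography -- it only
    # illustrates who can read what, not how a secure cipher works.
    return (key * (n // len(key) + 1))[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

# "Exceptional access" means every session key is also wrapped for an
# escrow authority, so the escrow key becomes a master key for everyone.
escrow_key = secrets.token_bytes(16)
session_key = secrets.token_bytes(16)
wrapped_for_escrow = encrypt(escrow_key, session_key)

ciphertext = encrypt(session_key, b"meet at noon")

# The escrow holder recovers the session key, then the message:
recovered_key = decrypt(escrow_key, wrapped_for_escrow)
print(decrypt(recovered_key, ciphertext))  # b'meet at noon'
```

The point of the sketch: the escrow key is a single point of failure, and anyone who obtains it inherits access to every conversation it covers -- which is why "a hole only the government can use" is not a coherent design goal.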
|
![]() |
by Daily Deal on (#3JK34)
Everyone knows you shouldn't be fumbling with your phone while you're driving. Now, you can stay safer with Muse, the Bluetooth device that brings the power and convenience of Amazon Alexa into your car. Just pair it with your smartphone, plug it into your car, and tell Alexa what she can do for you. Muse performs more than 30,000 Alexa skills, from playing music and audiobooks to opening the garage door, ordering food, and more. It provides hands-free entertainment with support for Amazon Music, iHeartRadio, and others. The Muse is on sale for $59.99.

Note: The Techdirt Deals Store is powered and curated by StackCommerce. A portion of all sales from Techdirt Deals helps support Techdirt. The products featured do not reflect endorsements by our editorial team.
|
![]() |
by Mike Masnick on (#3JJVS)
Earlier today, in discussing a long list of possible fixes for SESTA, we noted that the only one that even has a remote chance (i.e., the only fix that actually has the potential of being considered by the Senate) is Senator Wyden's amendment, which is designed to solve the "moderator's dilemma" issue by clarifying that merely using a filter or doing any sort of moderation for the sake of blocking some content does not automatically append liability to the service provider for content not removed. Senator Portman -- the sponsor of the bill -- has insisted (despite the lack of such language in the bill) that this is how SESTA should be interpreted. Specifically, Portman stated that SESTA:
|
![]() |
by Mike Masnick on (#3JJ07)
It appears that sometime this week (or even possibly today), the Senate is unfortunately likely to vote (perhaps by an overwhelming margin) for SESTA, despite the fact that it's a terribly drafted bill, and no one can explain how it will actually stop sex trafficking. Indeed, it's a bill that many victims' advocates are warning will not just make problems worse, but will put lives in danger. And that's leaving aside all of the damage it will do to free speech and tons of websites on the internet.

Much of this could have been avoided if anyone in Congress were actually interested in understanding how the internet works, and how to write a bill that actually addressed problems around sex trafficking -- rather than buying into a false narrative (pushed mainly by Hollywood) that the liability protections of CDA 230 were magically responsible for sex traffickers using the internet. Two academics who are probably the most knowledgeable experts on intermediary liability, Daphne Keller at Stanford and Eric Goldman at Santa Clara University, have each posted thoughts on how to "salvage" SESTA. If Congress were serious, it would listen to them. But that's a big "if."

Let's start with Keller's suggestions, which she helpfully put into a Twitter thread:
|
![]() |
by Leigh Beadon on (#3JGN5)
This week, following our coverage of the disturbing actions of a cop that led to a high-speed crash killing an infant, one commenter for some reason felt it was time to turn the blame around on the mother, suggesting the death must have been caused by her negligence. A reply from Alexander won first place for insightful:
|
![]() |
by Leigh Beadon on (#3JEPN)
Five Years Ago

This week in 2013, the Prenda situation positively exploded. As we awaited Monday's hearing, we learned more about Allan Mooney and saw Verizon get involved. Then, of course, the Prenda team itself didn't show up in court, meaning they escaped (at great cost) an absolutely crazy hearing with a very unhappy judge (written up for us by Ken White of Popehat fame). The judge ordered a second hearing and made it clear Prenda was expected to actually show up, while transcripts of John Steele's intimidating phone calls to Alan Cooper hit the docket, and Paul Duffy was scrambling to do some too-little-too-late damage control.

Ten Years Ago

This week in 2008, following the death of HD-DVD, the next question was whether Blu-ray would actually catch on in a big way. We now know it did, though early price hikes didn't help. But it certainly had nothing to fear from an ill-advised late entrant into the format wars. Meanwhile, having expressed displeasure with the agency's approach, EMI decided it wouldn't quit the IFPI, but would stop paying so much for its lawsuits against fans, while the IFPI was turning its sights on ISPs instead (and unsurprisingly triggering the Streisand effect when trying to block websites).

Fifteen Years Ago

This week in 2003, we watched the steady emergence of video game development courses at colleges, had an early discussion about Americans using the internet to find alternative news sources, and perhaps didn't realize just how revolutionary Amazon's focus on web services would be. There were still five big record labels, but they were looking to merge (while betting a tad too heavily on enhanced CDs), McDonald's became the second huge chain to start offering free wi-fi, and we looked at the debunking of a hoax story about a cyberwar virus targeting Iraq (though that idea wouldn't seem so crazy seven years later, when we all learned about Stuxnet in Iran). Also, Techdirt was chosen by Forbes as one of the five best tech blogs.
|
![]() |
by Glyn Moody on (#3JDD9)
We recently wrote about an important judgment from the EU's top court, the Court of Justice of the European Union (CJEU). The ruling said that corporate sovereignty provisions included in trade deals between the EU's member states were illegal. Significantly, the logic behind that decision suggests that any form of investor-state dispute settlement (ISDS) -- the official name for the corporate sovereignty framework -- even in trade deals involving countries outside the EU, would be forbidden too. Christina Eckes, professor of European law at the University of Amsterdam and director of the Amsterdam Centre for European Law and Governance, believes that the implications of the CJEU ruling are even broader.

Eckes says that in the wake of the judgment, serious doubts hang over the investment chapter in the Canada-EU trade deal, CETA, which has still not been ratified by all EU member states -- a process that is necessary before it comes into force definitively. In fact, Belgium has explicitly asked the CJEU to rule on the legality of the Investor Court System (ICS) in CETA, which is the modified version of corporate sovereignty that supposedly addresses its flaws. As a result, a ruling on whether CETA's investment chapter is legal is definitely on its way, and could have major implications for CETA and its ratification. However, Eckes points out that there is something called "EU loyalty", which:
|
![]() |
by Tim Cushing on (#3JD1K)
Buzzfeed has obtained files the NYPD never wanted the public to see. This isn't the result of a protracted public records battle, but rather the work of an anonymous whistleblower. Presumably, those further up the chain of command are already familiar with the department's disinterest in holding officers accountable, so there's no whistleblowing outlet there. Also, presumably, the Civilian Complaint Review Board's hands are tied and it cannot hand out disciplinary reports for officers never formally disciplined. So, leak it is. And what a leak it is.
|
![]() |
by Timothy Geigner on (#3JCSY)
In all of our conversations about video game piracy and the DRM that studios and publishers use to try to stave it off, the common refrain from those within the industry and others is that these cracking groups are nearly nihilism personified. Nothing is sacred to these people, goes the mantra, and they care nothing for the gaming industry at all. If the gaming industry is destroyed, it will be because of these pirate-y pirates simply not giving a damn.

This notion is belied by the story of Crackshell, makers of an indie spinoff of the Serious Sam franchise called Serious Sam's Bogus Detour, and Voksi, an individual who runs a game-cracking ring. Voksi has been featured in our pages before as one of the few people out there who has been able to consistently defeat the Denuvo DRM, helping propel the software's precipitous fall from grace. If a game developer and a game-cracker seem to be natural enemies, it will come as a surprise to you that they have recently teamed up to try to resurrect Bogus Detour from the bin of failure.

The whole story is useful for debunking the notion that these pirate sites and those that run them are pure venom for the game industry, but it's particularly useful to hear how this relationship came to be.
|