Address the problem, not the symptoms. The problem is at the platform level, so regulate platforms like Facebook. Search engines simply search what they scrape.
* If someone feels consequences of an offensive 2007 tweet, just delete it. Platforms should be required to make it easy to delete content.
* If someone is scared of saying controversial things, avoid platforms that require real identities. There are plenty of anonymous platforms.
* Hold social media to the same standards as traditional media, requiring truth and the propagation of retractions and corrections. Libel and slander are well-established concepts.
* Demand discretion from friends. In college, my group had a strict "no-camera" rule when it came to embarrassing or unlawful shenanigans. My parents' generation had the same rule.
Hiding search results does not address the problem. If someone posts a photoshop of "Eric driving drunk" on Twitter, I want that post promptly removed... the search results are just a symptom.
So someone posted a real photo of 'Eric driving drunk' to the web. Eric did not approve of that in any way and wants it gone. The someone could not care less. What is the solution in your model?
Because Google makes it easy to find. Nobody cares about some rant about you by an anonymous customer, or by your ex-partner, when it is on page 73. But if it is on page 1 and the other party refuses to take it down, you just feed an endless army of lawyers.
> The problem isn’t that the content exists; it’s that google surfaces it.
Removing it at the source removes it from google.
It's like saying the problem isn't that there is a negative story about you in The New York Times from 2007, the problem is that ISPs allow their customers to read that story in The New York Times. Obviously the "problem" is the story -- which you may have no legitimate right to prevent people from reading -- and if you have a legitimate complaint (i.e. libel) then you should have to take it up with The New York Times and not Comcast or Google.
The reason people want to go to Google instead is that they know Google doesn't have a strong enough incentive to stand up for the victim of the censorship. If you go to the source, they may refuse to take it down and force you to adjudicate the matter in court, where they can argue their side in front of a judge. If you go to Google, economically they have to take it down: nobody is paying them to hire lawyers to spend the hours it takes to make accurate legal determinations, and they would go out of business taking on that role, uncompensated, for seven billion people.
Probably the problem description should be something that is legal in most places but would cause real problems for anyone caught doing it.
So let's say Eric was out drunk with his friends, and they walked into a sex shop and picked up some dildos because hah hah, this shit is funny, we are so drunk. Eric took some silly photos cuddling a dildo or mimicking putting it in his ass, someone posted them to the internet, and now Eric would like them gone.
Now Eric's pictures are actually sort of funny, so they have gotten some exposure, and Eric finds he can't keep his job at the local macho place of employment. I don't know why; maybe he's not good at talking shit back to anyone who talks shit to him, so everybody picks on him about his stupid funny photos.
So he gets fired, or quits because he can't handle the harassment and is sick of going out and hearing "hey, it's dildo guy." He moves away. Nobody in his new town knows he's dildo guy, but then he goes out one night with a girl, and somebody exclaims: omg, it's that dildo guy! Hey, look at this everybody, I got me some google skills, I saw this hilarious picture one time.
Now on the one hand Eric has given the world a lot of (unpaid) entertainment as dildo guy. On the other hand, when he commits suicide because he is emotionally unequipped to go through life being called out every now and then as dildo guy, he makes people feel bad. So, to allow Eric to take his stupid drunk dildo-hugging photo without ruining the rest of his life over something embarrassing but totally legal, let's just give him the right to remove the stuff from Google. That way, when he moves away from his old town, where lots of people remember the whole dildo-picture situation, people in his new town didn't know him when it was fresh, and his stupidity is somehow 'forgotten'. This preserves an ability people have had throughout history: moving away from an area because their reputation there had become too problematic to continue. Google makes sure the reputation follows.
That's a really poor metaphor because the fridge is the problem - you don't want people seeing that content because it's inherently "bad". If it's not on Google search, it becomes harder to find... but it's still there, still discoverable, and now you have no awareness or control over the situation.
I guarantee Yandex and Baidu will not respect western delusions of security through obscurity.
Well, maybe Eric shouldn't have "driven drunk"? I'm not sure why search providers should be held accountable for protecting the carefully curated (false) reputations of individuals.
All of the "right to be forgotten" arguments seem to devolve to "people shouldn't be able to know any data points that would negatively affect my social standing".
Even more troubling is deep fake technology that's run rampant. Should a serious director never hire Emma Watson again because she's such a prominent porn actress? Of course "we all know" that's fake. But how would we ever know Eric's stuff is fake?
Contact the platform to demand removal. The porn industry has SOPs for dealing with this problem; I am not a lawyer, but I imagine that process could be generalized and legally enforced.
If that someone shared it on their own website, hosted out-of-country, on their own hardware... well, I can't do anything about that, but I also wouldn't care. It's clearly non-authoritative and won't get SEO traction by itself.
A prospective Eric could engage in some formal process to require distinctive detail be added to that 'accusation' so that it does not defame the Eric which did not drive drunk.
So you would rather that only the rich can afford to protect themselves? RTBF makes it possible for ordinary citizens who can't afford fancy lawyers to seek redress against big corporations. This is one of the reasons it's so important.
> So you would rather only the rich can afford to protect themselves?
Does Europe not have the concept of a court-appointed lawyer?
> RTBF makes it possible for ordinary citizens who can't afford fancy lawyers to fight big corporations to seek redress.
No it makes it possible for criminals to hide their criminality. To be very blunt, if I were an employer, I don't want to hire a murderer. I don't care if he/she was convicted and served their time. Once you have committed murder, I will not hire you. I will not deal with you in any way. Hiding/obfuscating true information from me is wrong. Criminality is public record. It should be easily accessible.
> Does Europe not have the concept of a court-appointed lawyer?
Not sure you understand how court-appointed attorneys work. First off, you can't use one to sue someone else, since you can only get a court-appointed attorney when you're being charged in a criminal case.
Second, court-appointed attorneys suck. They are underpaid, overworked, and unable to properly do their jobs as is.
Requiring legal action would indeed create a scenario where only the rich bring such cases to trial.
RTBF comes about because someone did exactly that, and the publisher ignored the court rulings.
I'm uneasy about some RTBF cases, but where you have a person making prolific publications across a variety of legal jurisdictions and ignoring the legal rulings (which happens in cases of stalking and harassment, for example), it's impossible for the victim (and they do suffer real harm) to get justice other than by asking search engines to de-index the attack pages.
Search engines only show what's posted by other people, so while they're a convenient target, they aren't the actual bad actor.
Our legal system tends to attack actual bad actors, not convenient targets. Unless the Internet is involved. The way RTBF and DMCA work, search engines bear 100% of the cost, and people don't ever go after the actual bad actors.
By the way, in the US, if Tabloid X publishes Eric's photo, that's 100% legal thanks to the First Amendment, as long as the photographer agrees. Eric has no part. Attacking search engines on behalf of Eric, in the US, not so cool. Europe doesn't have free speech, so, no problem.
> Europe doesn't have free speech, so, no problem.
Europe definitely has free speech (all the countries I know of anyway, Europe has a lot of different countries). There is just a different definition of what exactly is free speech and what is something else (some racist things are not considered free speech).
Nope, it's not: it's an EU law, and EU !== Europe. Europe has 51 countries, while the EU only has 28. Just like Mexico is not in the US, a ton of European countries are not in the EU.
> The way RTBF and DMCA work, search engines bear 100% of the cost, and people don't ever go after the actual bad actors.
RTBF, maybe, but how is the DMCA this way? Certainly not the takedown notice/counternotice provision, which doesn't create new liability, only a special shield from any pre-existing liability.
> how is the DMCA this way? Certainly not the takedown notice/counternotice provision, which doesn't create new liability, only a special shield from any pre-existing liability.
The problem is that the shield has value to the search engine in excess of the cost to the search engine (though not the cost to the censorship victim) of removing the information. It reduces their risk even when the risk is low because they would be likely to win, and it removes the cost of having to litigate the issue even if they do win. So the search engine takes the deal and then executes ~all the notices, even when they're bogus.
That is a direct cost to the censorship victim compared to handling it the way Section 230 does, and an indirect cost to the search engine, because it has them paying to process the removal of legitimate information, which reduces the value of their service to customers. It's just not enough of an indirect cost to give them an incentive to refuse, because the brunt of the cost falls on the third party being censored.
The DMCA causes huge costs for search engines because of all of the bogus takedowns, which people can send without any consequences. Congress could fix that aspect of the law, but has chosen not to.
> The DMCA causes huge costs for search engines because of all of the bogus takedowns
The search engine could ignore all takedowns and be in exactly the same situation as it would be without the DMCA; the safe harbor provision isn't a mandate on them, it is a benefit to them. They deal with takedowns because the cost of doing so is less than the cost of copyright liability they would have without the DMCA, which means the DMCA is saving them money, not imposing a cost.
> The search engine could ignore all takedowns and be in exactly the same situation as it would be without the DMCA; the safe harbor provision isn't a mandate on them, it is a benefit to them. They deal with takedowns because the cost of doing so is less than the cost of copyright liability they would have without the DMCA, which means the DMCA is saving them money, not imposing a cost.
You're treating the safe harbor and notice and takedown as indivisible when they obviously aren't. Conditioning the safe harbor on notice and takedown is a huge cost compared to the alternative used in CDA 230.
> You're treating the safe harbor and notice and takedown as indivisible when they obviously aren't.
They obviously are both part of the DMCA, so you can't say that the DMCA imposes costs based on the notice and takedown condition for the safe harbor, because ignoring that condition leaves the host in the same position they would be in without the DMCA.
You can say that the notice and takedown requirement reduces the cost savings of the safe harbor compared to the hypothetical policy of an unconditional safe harbor, or one with alternative sets of conditions, but that's a very different claim than the DMCA imposing costs.
> They obviously are both part of the DMCA, so you can't say that the DMCA imposes costs based on the notice and takedown condition for the safe harbor, because ignoring that condition leaves the host in the same position they would be in without the DMCA.
The safe harbor and the anti-circumvention rules are both part of the DMCA too, but it's silly to argue that the anti-circumvention rules don't impose net costs because if you average them together with the safe harbor it comes out somewhere near neutral. They don't cease to be divisible just because they were enacted at the same time. Otherwise you could justify anything by just finding something which is as good as the target thing is bad and lumping them together on the same side of the scale.
Er, ok, I formed my opinion about the relative liability cost as the executive in charge of DMCA compliance at a search engine that raised $63mm. I dislike appeals to authority on HN as much as the next person, but you sure seem confident about small search engines, something only a small number of people actually have experience with.
And if you've never gotten a DMCA takedown from Perfect 10, you probably don't understand the true terror of the DMCA process.
> Europe doesn't have free speech, so, no problem.
Nowhere[1] has the US's extremist version of freedom of speech. Europe has a different form of free speech, and in this example Eric's right to be forgotten probably doesn't trump Someone's right to publish true information.
I guess there'd be some judicial attempt to balance these two rights: Does Eric pose a continuing risk to the public from drunk driving? Is Eric a public figure who's claimed to never have driven drunk? Was this a one-off event that happened many years ago, never repeated? This would be something courts are able to decide.
You missed my actual point, which is that the search engine is just showing what a website published. RTBF attacks the search engine in that case, not the actual website.
All of the cost is pushed onto the search engine, which is a huge barrier to entry; yet Europe says it wants more search engines.
BTW, search engines don't "publish" information in the US sense of the word. The way Russia forces Yandex to self-censor is that Yandex is liable for everything they show to users. That's 'publishing' in the US sense. Newspapers have publishers, and the publisher is the person you sue if you think the newspaper has libeled you.
Meanwhile, Europe (mostly) doesn't consider RTBF to be censorship because it only involves censoring search engines and not newspapers. Except that people are filing RTBF against newspaper site search, too.
> * If someone feels consequences of an offensive 2007 tweet, just delete it. Platforms should be required to make it easy to delete content.
...and I read that tweet in 2007, and quoted it in a blog post on my obscure, low traffic blog. Google will still find it.
> * Hold social media to the same standards as traditional media, requiring truth and the propagation of retractions and corrections. Libel and slander are well-established concepts.
What about the case where the damaging content is true? A key thing here is that what is acceptable changes over time. Something that can be a life-ruining social faux pas today may have been pretty normal 20 years ago, and many people today won't accept the "oh, that was normal back then" explanation.
We used to be able to avoid these problems because it took effort to dig up records from 20 years ago, and from low circulation sources like local newspapers.
So, for example, if you did stupid things in your home town that ended up in your high school newspaper, and then 20 years later were applying for a job in a city in another state...the employer would probably not find that high school newspaper, even if you were applying for a fairly sensitive job in an industry like finance.
That's because to find things like that they would have to actually send someone to visit your high school library and comb through their archives of the high school newspaper. That's just too expensive to do routinely for job applicants, except for the most sensitive positions.
Nowadays, all that stuff ends up online from the start, and it is cheap and easy to find.
> * Demand discretion from friends. In college, my group had a strict "no-camera" rule when it came to embarrassing or unlawful shenanigans. My parents' generation had the same rule.
All it takes is one person in the group to slip up, or for you to overlook one third party who is not part of your agreement and who can see you. So really, the rule has to be don't undertake embarrassing or unlawful shenanigans. (And as I noted above, standards for what is embarrassing or unlawful can change over time, so it really needs to be don't do anything that could conceivably become embarrassing or unlawful in the next 50 or so years).
Essentially we used to balance privacy vs. public access kind of automatically, due to the limitations we had in information storage, indexing, and retrieval. We've removed most of those limitations, so the balance has been lost.
> If someone feels consequences of an offensive 2007 tweet, just delete it. Platforms should be required to make it easy to delete content.
“Just delete it”.
Except have you ever personally been in the situation that you needed something removed? I have and:
1) You might not even have the password to every random account you created in the past, nor the e-mail addresses that you used when you created those profiles, nor maybe even remember what e-mail address you used for each of them.
2) Turns out that there are a lot of sites out there that copy and preserve a lot of random data from other sites. They do so without regard to the ToS of the site you originally posted to. They do not care about copyright. They do not respond to personal requests for removal of data. They do not respond to DMCA notices. They are outside the jurisdiction of the country you live in, as are their hosting providers. And even if they are cooperative, there are so many of them that reaching out to all of them and following up on the removal will require much, much more time and energy than you have available.
So then the best you can do is delete what you can and submit the rest for removal from Google.
“Well you shouldn’t have posted it in the first place if you didn’t want it to be public”, right? No, it’s not that simple!
The things you post today can be taken out of context and misinterpreted by someone in the future in ways you would never have imagined today.
We keep posting comments, pictures, and videos, creating profiles, liking and sharing posts and information, but most of us rarely delete any of it. As the amount of data increases, so does the room for cherry-picking data about you to build up an image of you that, while true in the sense that all of it is things you posted, wildly misrepresents what kind of person you are; and on top of this misrepresentation an even more inaccurate image can be painted.
If you had any idea what it feels like to have that happen to you, I think you would want to be able to have some of that information at the very least removed from search results.
Once it’s gone from search, it’s gone from the public eye. And if you are lucky you are able to erase the bits of information that ties the data to you so that even if the data resurfaces in the future it is no longer connected to you, or at least not as directly.
Furthermore, when you are working on having information removed you should first make a list of all of the information, then have it removed from Google ASAP so that 1) it gets harder to find as soon as possible and 2) so that the information is not retained in the publicly available caches of search engines after it’s been deleted from the source sites.
Beyond that, for the information that you could not get deleted but which you were able to have removed from search results, some of it will eventually disappear altogether on its own because of bitrot (hardware failures, data management errors, sites going out of business, etc.) and some of it will probably stick around forever.
But like I said you want as much of it removed as possible and you want the rest of it to be hard to find and you want as much of it as possible to lose connection to you. And achieving that requires the cooperation of the search engines in removing results.
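The workflow described above (inventory everything first, file the search-engine request for fast impact, then chase each source site and follow up) is essentially a bookkeeping problem. A minimal, purely hypothetical sketch of tracking it; the `RemovalItem` type and status names are illustrative inventions, not any real tool or API:

```python
# Hypothetical removal-request tracker. Statuses flow:
# listed -> requested -> removed (or refused).
from dataclasses import dataclass


@dataclass
class RemovalItem:
    url: str                  # where the content lives or is indexed
    host: str                 # who to contact (source site or search engine)
    status: str = "listed"    # not yet acted on


def next_actions(items):
    """Items that still need an initial request or a follow-up."""
    return [i for i in items if i.status in ("listed", "requested")]


# Step 1: list everything up front, so nothing is lost track of
# once pages start disappearing from search.
items = [
    RemovalItem("https://example.com/old-post", "example.com"),
    RemovalItem("https://example.com/old-post", "Google (de-index + cache)"),
]

# Step 2: file the search-engine request first, then work through sources.
items[1].status = "requested"

pending = next_actions(items)  # both entries still need attention
```

The point of the sketch is only that a written-down inventory with per-host statuses is what makes the "follow up on every removal" part of the comment feasible at all.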
But Google does cooperate with DMCA. The difference is that content deemed in violation of copyright is actually illegal for anyone to distribute; legal responsibility extends all the way to the website owner.
Unless content falling under "right to be forgotten" is ruled privileged and not legal for public distribution, any artificial roadblocks to their discovery will merely present a business opportunity for their circumvention.
It is that simple, just the same as in real life. If you're "saying, liking, sharing" things that "don't represent who you are," maybe you should take some time for introspection rather than demanding that the world follow your narrative.
Information you, yourself, post publicly to the internet is public. Just the same as if you got up in Times Square and shouted it using a megaphone.
Information that is factually accurate and posted publicly on the internet isn't yours to censor. This falls heavily into the camp of "freedom of my speech, not freedom of your speech" that seems so common here.
Information posted by others that isn't factual is already covered by libel and slander laws, so it doesn't fall under this.
The internet should be, and for the sake of truth has to be, immutable. The "right to be forgotten" is the right to break any concept of online reputation.
If you want to control your narrative, maybe don't post thoughtlessly and publicly.
> If you want to control your narrative, maybe don't post thoughtlessly and publicly.
I am not posting thoughtlessly. What I am saying is that there are a million ways that anything can be interpreted in the future that you have no way of foreseeing.
Even silence can be interpreted in such a manner. Should we be able to retroactively edit the past if it somehow concerns us? Seriously, given how stirred up things are, it's not unrealistic to imagine that someone would want you to "feel guilty" for not posting something in the past.
This is not a tech problem (at all). This is a social problem that has existed forever but is now exposed by technology's availability. And if the agreed solution to "the world's gone mad" is to grant one the legal ability to alter others' memories, then the world has truly gone mad.
The web should be considered WRITE ONLY. We should NOT remove content.
However, that doesn't mean that annotating content, or making it clearer that a different (Firstname Lastname) did something, might not be a better response. For example, I have never created an account on facebook, linkedin, twitter, etc. I refuse to give any one company a de facto monopoly over social discourse and interaction; those tools belong on OPEN, FREE (libre+beer), well-defined, minimally interoperable standard platforms. Currently that's email; it really sucks, but the standard is well defined, anyone /can/ implement it without barriers, and everyone is forced to federate to at least some degree.
>The web should be considered WRITE ONLY. We should NOT remove content.
It isn't and it has never been.
Given that different countries have different laws, and yet all claim universal application of their laws, can you imagine how many people would be killed because of this?
Atheists taken off Arab Emirates flights because they posted about god. Homosexual activists getting assassinated by Russia. Europeans being arrested in the US because of differences in the age of consent.
Immutable data structures have their place in programming, and immutable communication has its place in society, but neither is appropriate for all use cases.