I agree with most of that. But 230 allows moderation acting "in good faith." No accountability. I like other countries' approach that sites should aim to remove any illegal activity, and their authority stops there. Big Tech should not have more power than the government.
In a perfect world, we would all have an equal voice. Don't get me wrong: I scoff at Parler, but I'm also disgusted with Big Tech freezing them out. I took Twitter off my phone in protest of all this. They are ****ing up big time. But I don't want to kill the internet just to spite Twitter. I'd rather live with the problem until we find a smarter solution than removing Section 230 protection, which websites of all sizes need, not just Big Tech.
If a company consistently uses its “platform” to push narratives that align with its political leanings then that seems to violate the spirit of section 230.
And? I don't disagree, but they have a right to handle content as they see fit, just like every other social media outlet.
Trump's use of Twitter was a big part of his loss.
I agree with most of that. But 230 allows moderation acting "in good faith." No accountability. I like other countries' approach that sites should aim to remove any illegal activity, and their authority stops there. Big Tech should not have more power than the government.
Rumor has it that Twitter began using Amazon Web Services roughly 2 weeks prior to Parler being deplatformed.
Yeah, which presents another concerning avenue for fixing this. If we're going to crack down on websites that moderate without good faith, who is going to determine what that means, and who is going to decide when violations have occurred? A corrupt federal government that I don't trust, that's who.
It seems like maybe the best avenue is antitrust law, which I generally do not like, but it is a better method than anything that's been proposed. The Apple App Store, Google Play, and Amazon Web Services all colluding to kill Parler seems like something that could fall under antitrust. But IDK for sure.
Since the pandemic that Trump didn't create, people have lost jobs. Jobs were increasing nicely under Trump to that point.
A State Department report found that no more than 50 jobs would be required to maintain the pipeline. The majority of the jobs you must be referring to were seasonal construction work, which is somewhere between 10K and 11K jobs... which would have lasted between 6 and 8 months. The impact of this on the labor market has been grossly exaggerated by the right. Per the Bureau of Labor Statistics, since Trump took office in January of 2017, approximately 4 million Americans have lost their jobs. Cutting the Keystone Pipeline is a drop in the bucket compared to the real problems facing this country right now.
If a company consistently uses its “platform” to push narratives that align with its political leanings then that seems to violate the spirit of section 230.
Why can’t companies be found to be operating outside of 230 and held accountable (loss of legal protection) when applicable?
Not sure why anyone would have a problem with that.
It would address the issue on an individual company/platform basis, not a repeal of 230.
Hard to argue acting in good faith with examples like this one JMO
I don't see how it violates the spirit of 230.
It says they can't be held liable for "any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected"
"Or otherwise objectionable" really leaves it ambiguous enough to include things like political beliefs. I'm not sure I'd change that anyway.
Why don't we just leave it alone? It seems the system we've got has created a lot of jobs and wealth, and now you butt hurt conservatives want to kill it. I thought you guys were supposed to be pro-business.
Only when it serves their interests at the moment, and that is always subject to change. Trumpism and hypocrisy at its finest.
Why don't we just leave it alone? It seems the system we've got has created a lot of jobs and wealth, and now you butt hurt conservatives want to kill it. I thought you guys were supposed to be pro-business.
If H1N1 from 2009 had impacted the economy to a similar extent, Donald Trump would have been on Twitter every day completely blaming Barack Obama for a pandemic that Obama had not created. This is one of many inconsistent patterns from Trump supporters. You expect better behavior from Democrats than you ever expected from Trump. Is it wrong to blame Trump for the loss of jobs from COVID? Sure. Should a Democrat have any qualms with doing it? Hell no.
Since the pandemic that Trump didn't create, people have lost jobs. Jobs were increasing nicely under Trump to that point.
Dems treat jobs as a political tool. Shut down businesses for months, then when numbers are at their highest, open everything back up to make the new administration look good. They should be ashamed of playing with people's livelihoods like that.
Hard to argue acting in good faith with examples like this one JMO
Read the article. They refused to remove it.
What, because they f'd up by not removing child porn in one instance, they should be held liable for what anyone posts anywhere on their site? There are probably millions of tweets per day. That doesn't make any sense.
I'm for holding twitter accountable for knowingly hosting child pornography on their site, if that helps.
Read the article. They refused to remove it.
Did you read the article?
Maybe somebody at Twitter just made a mistake?
Have you seen the images? I don't want to see them, so I haven't explored this, but just because someone thinks something is pornographic doesn't make it pornographic. If it's graphic, I get it. If it's just him with his shirt off, in a terrible situation that isn't captured by the image itself, I can see how Twitter would pass on censoring it. Which is it?
And beyond all that, I would bet this is an outlier. Twitter has a consistent history of not having child pornography on their platform. You're looking at the .0000001% that's out there and demonizing Twitter for it.
Twitter has a consistent history of not having child pornography on their platform.
Twitter refused to remove child porn because it didn’t ‘violate policies’: lawsuit
By Gabrielle Fonrouge
Twitter refused to take down widely shared pornographic images and videos of a teenage sex trafficking victim because an investigation “didn’t find a violation” of the company’s “policies,” a scathing lawsuit alleges.
The federal suit, filed Wednesday by the victim and his mother in the Northern District of California, alleges Twitter made money off the clips, which showed a 13-year-old engaged in sex acts and are a form of child sexual abuse material, or child porn, the suit states.
The teen — who is now 17 and lives in Florida — is identified only as John Doe and was between 13 and 14 years old when sex traffickers, posing as a 16-year-old female classmate, started chatting with him on Snapchat, the suit alleges.
Doe and the traffickers allegedly exchanged nude photos before the conversation turned to blackmail: If the teen didn’t share more sexually graphic photos and videos, the explicit material he’d already sent would be shared with his “parents, coach, pastor” and others, the suit states.
Doe, acting under duress, initially complied and sent videos of himself performing sex acts and was also told to include another child in his videos, which he did, the suit claims.
Eventually, Doe blocked the traffickers and they stopped harassing him, but at some point in 2019, the videos surfaced on Twitter under two accounts that were known to share child sexual abuse material, court papers allege.
Over the next month, the videos would be reported to Twitter at least three times — first on Dec. 25, 2019 — but the tech giant failed to do anything about it until a federal law enforcement officer got involved, the suit states.
Doe became aware of the tweets in January 2020 because they’d been viewed widely by his classmates, which subjected him to “teasing, harassment, vicious bullying” and led him to become “suicidal,” court records show.
While Doe’s parents contacted the school and made police reports, he filed a complaint with Twitter, saying there were two tweets depicting child pornography of himself and they needed to be removed because they were illegal, harmful and were in violation of the site’s policies.
A support agent followed up and asked for a copy of Doe’s ID so they could prove it was him and after the teen complied, there was no response for a week, the family claims.
Around the same time, Doe’s mother filed two complaints to Twitter reporting the same material and for a week, she also received no response, the suit states.
Finally on Jan. 28, Twitter replied to Doe and said they wouldn’t be taking down the material, which had already racked up over 167,000 views and 2,223 retweets, the suit states.
“Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time,” the response reads, according to the lawsuit.
“If you believe there’s a potential copyright infringement, please start a new report. If the content is hosted on a third-party website, you’ll need to contact that website’s support team to report it. Your safety is the most important thing, and if you believe you are in danger, we encourage you to contact your local authorities.”
In his response, published in the complaint, Doe appeared shocked.
“What do you mean you don’t see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down,” the teen wrote back to Twitter.
He even included his case number from a local law enforcement agency, but still the tech giant allegedly ignored him and refused to do anything about the illegal child sexual abuse material — as it continued to rack up more and more views.
Two days later, Doe’s mom was connected with an agent from the Department of Homeland Security through a mutual contact who successfully had the videos removed on Jan. 30, the suit states.
“Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children,” states the suit, filed by the National Center on Sexual Exploitation and two law firms.
“This is directly in contrast to what their automated reply message and User Agreement state they will do to protect children.”
The disturbing lawsuit goes on to allege Twitter knowingly hosts creeps who use the platform to exchange child porn material and profits from it by including ads interspersed between tweets advertising or requesting the material.
Early Thursday, Twitter declined comment to The Post but later in the day, reversed course and sent a statement by email.
“Twitter has zero-tolerance for any material that features or promotes child sexual exploitation. We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy,” a Twitter spokesperson wrote.
“Our dedicated teams work to stay ahead of bad-faith actors and to ensure we’re doing everything we can to remove content, facilitate investigations, and protect minors from harm — both on and offline.”
Did you read the article?
How could anyone confirm the accuracy of this statement?
I guess people who use Twitter and people who follow the world. If there were lots of it going on, we'd definitely be hearing about it.
I read the article. Can you maybe highlight the part that answers my question? I don't think it's there.