Techno-fascism

Yep, enforcing stuff from 1996 screams big government.

Hey man, however you want to convince yourself that more government control over private companies and private citizens is a good thing just because your boy is the one in office, go on ahead. Just don't pretend like you hate big government. We know that isn't true.
 
1996

Enforcing laws and Acts on the books has nothing to do with big government.
 

There is nothing explicitly on the books to this effect as you keep insisting. You're only talking about an interpretation of this Act--one which notably lacks broad support (and any legal precedent) for obvious reasons. But by all means keep acting like we're the ones who don't understand.
 
We will see how the lawsuits using that same interpretation go.
 
Well well well, we've got anti-Bill of Rights crybabies on both sides of the aisle on this issue. What a surprise.

 
The insider claims that while working at Google, he found “a machine learning algorithm called ML fairness, ML standing for machine learning, and fairness meaning whatever they want to define as fair.” (6:34) The implication is that Google employees actively take steps to ensure that Google search results yield anti-conservative content rather than what a neutral search algorithm would return. Unfortunately, what a “neutral” algorithm would look like is not discussed.

Although we’re living in the midst of a new tech-panic, we should remember that questions about bias in machine learning and attempts to answer them are not new, nor are they merely a concern of the right. Rep. Alexandria Ocasio-Cortez (D-NY) and the International Committee of the Fourth International have expressed concerns about algorithmic bias. Adequate or correct representation is subjective, and increasingly a political subject. In 2017, the World Socialist Web Site sent a letter to Google, bemoaning the tech giant’s “anti-left bias” and claiming that Google is “‘disappearing’ the WSWS from the results of search requests.”

However, despite the breathlessness with which O’Keefe “exposes” Google’s efforts to reduce bias in its algorithms, he doesn’t bring us much new information. The documents he presents alongside contextless hidden camera clips of Google employees fail to paint a picture of fairness in machine learning run amok.

One of the key problems with O’Keefe’s video is that he creates a false dichotomy between pure, user-created signals and machine learning inputs that have been curated to eliminate eventual output bias. The unnamed insider claims that attempts to rectify algorithmic bias are equivalent to vandalism: “because that source of truth (organic user input) has been vandalized, the output of the algorithm is also reflecting that vandalism” (8:14).

But there is little reason to presumptively expect organic data to generate more “truthful” or “correct” outputs than training data that has been curated in some fashion. Algorithms sort and classify data, rendering raw input useful. Part of tuning any given machine learning algorithm is providing it with training data, looking at its output, and then comparing that output to what we already know to be true.
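
To make that tuning loop concrete, here is a minimal sketch in Python, assuming a generic scikit-learn classifier and a hypothetical labeled dataset (nothing here reflects Google's actual systems): the model is fit on curated training data, and its raw output is then compared against labels already known to be correct.

```python
# Minimal, hypothetical sketch of the tune-and-compare loop described above.
# The dataset, features, and model choice are illustrative assumptions.
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

def tune_and_check(X, y):
    # Hold out examples whose correct labels are already known, so the
    # algorithm's raw output can be compared to ground truth.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)           # learn from the curated training data

    predictions = model.predict(X_test)   # the algorithm's output
    return accuracy_score(y_test, predictions)  # compare output to known truth
```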

Take a recent example from Wimbledon. IBM uses machine learning to select highlight clips; the algorithm’s inputs include player movements and crowd reactions. While crowd reactions can provide valuable signals, they can also be misleading. “An American playing on an outside court on 4 July may get a disproportionate amount of support, throwing the highlight picking algorithm out of sync.” While we expect the crowd’s cheers to be driven by their appreciation of a player’s skill, they may also cheer to celebrate the appearance of an American on Independence Day. If IBM wants to identify moments of skillful play rather than the mere appearance of Americans on the court, it must reduce the relative importance of audience applause in its algorithm, debiasing it.
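
IBM has not published its highlight-selection code, but the reweighting idea can be sketched in a few hypothetical lines of Python: the crowd-reaction signal stays in the model, it simply counts for less than evidence of skillful play. The weights and the two-feature scoring function below are assumptions for illustration only.

```python
# Hypothetical illustration of de-biasing by down-weighting a misleading signal.
# Inputs are assumed to be normalized to the range [0, 1].
def highlight_score(player_movement, crowd_reaction,
                    movement_weight=0.8, crowd_weight=0.2):
    # Crowd noise still contributes, but a partisan July 4th crowd cheering
    # an American cannot dominate the ranking on its own.
    return movement_weight * player_movement + crowd_weight * crowd_reaction

# An unremarkable rally with a very loud crowd scores below
# a skillful rally with a quieter crowd.
print(highlight_score(player_movement=0.3, crowd_reaction=0.9))  # 0.42
print(highlight_score(player_movement=0.8, crowd_reaction=0.4))  # 0.72
```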

Despite the insider’s claim that “they would never admit this publicly,” (9:45) Google is quite open about its efforts to prevent algorithmic bias. The firm maintains a list of machine learning fairness resources, including an extensive glossary of terms describing different sorts of bias, and sample code demonstrating how to train classifiers while avoiding bias. These public resources are, frankly, far more extensive, and reveal more about Google’s efforts to prevent machine learning bias, than anything in the latest Veritas video.
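
As an illustration of the kind of check such fairness resources describe, the hypothetical snippet below compares a classifier's true positive rate across groups, roughly the idea sometimes called "equality of opportunity" in fairness glossaries; a large gap between groups is one common working definition of a biased classifier. The data and group labels are invented for the example.

```python
# Hypothetical example of a group-wise bias check: compare the true positive
# rate (TPR) across groups; a large gap flags potential classifier bias.
from collections import defaultdict

def true_positive_rate_by_group(y_true, y_pred, groups):
    positives = defaultdict(int)  # actual positives seen per group
    hits = defaultdict(int)       # correctly predicted positives per group
    for truth, pred, group in zip(y_true, y_pred, groups):
        if truth == 1:
            positives[group] += 1
            if pred == 1:
                hits[group] += 1
    return {g: hits[g] / positives[g] for g in positives}

# Invented data: the classifier catches half the positives in group "a"
# but all of them in group "b" -- a gap worth investigating.
print(true_positive_rate_by_group(
    y_true=[1, 1, 0, 1, 1],
    y_pred=[1, 0, 0, 1, 1],
    groups=["a", "a", "a", "b", "b"]))   # {'a': 0.5, 'b': 1.0}
```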

The fact that Google News is not an organic, unfiltered search product (11:30) is not news either. Google’s news content policies are open to the public, and Google gives further public guidance to publishers as to what their algorithms prioritize in news pages.

The “demonstration” of Google search bias that follows, relying on autocomplete suggestions rather than actual search results, is far from “undeniable.” O’Keefe first types “Hillary Clinton’s emails are” into Google’s search bar and notes that Google does not continue to autofill the search. Without actually conducting a search, they conclude that “Google is suggesting that people do not search for this term” and that “its not even worth returning any results for” (15:48). Had they run the search, Google would have returned millions of web pages concerning Clinton’s use of a private email server as a government employee. When one uses a more generic query, with less punctuation, typing “clinton ema” into the search bar, Google autosuggests “clinton emails on film” and “clinton emails foia”, and surfaces results from Judicial Watch and the Daily Caller.


Misleading Project Veritas Accusations of Google "Bias" Could Prompt Bad Law
 
Fascism/fascist have to be the most overused words in the English language currently.

People freaked out on me because I had a thread about Trump being ideologically authoritarian but it's fine to call a search platform outright "fascist" because they might be biased.
 
