Let's just face the facts: the far-left has won academia, Hollywood, and sports, and has infiltrated many corporate boards at big companies. They have infiltrated the public education system and have indoctrinated younger folks into believing that America is inherently evil and racist at its core. There are many polls that support this.
There is an entire generation of Americans who do not believe in the American dream and who believe the country is uniquely evil. These younger people screaming for the American "system" to be torn down are going to hold high positions of authority someday, where they will have the power to do so.
In my opinion, it's only a matter of time before the country as we know it ceases to exist. I think this happens within the next 30 years.