An intriguing aspect of watching American elections revolves around the point at which a swathe of folks have arranged enough data to have a rigid bias about a candidate or an issue. At that point they are no longer working with or toward verifiable facts–coherent data sets that can be checked for consistency–but with rigged facts–incoherent data sets, often incomplete or contradictory, that confirm the bias.
Rigged facts have an ominous air about them, and those who espouse them do not respond well to critical engagement. Yes, such people will talk about being “critical” and even insist on using information to make sure folks know what is “really going on.” But the crucial point lies in how their assertions are built from cherry-picked data and held together statically by force of will.
Critical thinking, on the other hand, only works so long as the collection of data flows toward coordinating verifiable facts into valid–or at the very least strong–informative arguments. The biggest problem with those who begin coordinating rigged facts into weak–and very often invalid–arguments is that they have given up critique for criticism and argument for arguing. Engagement with such folks almost always goes south for the simple reason that you are not challenging the information itself; you are challenging their reality. As Ortega y Gasset says in Historical Reason,
We have ideas but we inhabit beliefs. Man [sic] always lives in the belief of this or that, and on the basis of these beliefs — which to him are reality itself — he exists, behaves, and thinks. Thus even the most skeptical of men is a believer, and profoundly credulous. (1984, 21)
All of us, even the critical thinker, inhabit a world shaped by what we believe. Critical thinking, as an aspect of philosophizing, helps us survey that world we inhabit. But the tools that allow for that surveying are like any other tools put to bad use: they can build the unnecessary, make the beautiful grotesque, and even become instruments of violence. For this reason, philosophizing requires that we tell the difference between those who are rigidly biased and those who have strong conviction. The distinction rests on whether someone has made rigged facts a cornerstone of the beliefs they inhabit or whether they are putting forward verifiable facts as an idea they have and want to share. In the former case, you would be a trespasser; in the latter, you could be a neighbor or roommate.
This is why we are hearing a sharp increase in xenophobic rhetoric as well as watching the devolution into panic among those who decided to build their reality on sand. It is also why both the left and the right have become ultra-concerned with the purity of their beliefs.
…studies show that attempts to refute false information often backfire and lead people to hold on to their misperceptions even more strongly.
In 1979, Charles Lord performed a seminal piece of research revealing that when you show someone factual, scientific evidence that they are wrong, they react badly. They will accept only the evidence that fits their pre-existing views. Lord called this effect “confirmation bias.” There have been hundreds of studies since, all finding the same result: when you argue using facts and evidence, people generally reject or discount your evidence. Instead of changing their minds, most will dig in their heels and cling even more firmly to their originally held views. Brendan Nyhan of Dartmouth and Jason Reifler of the University of Exeter have documented an even more alarming tendency, which they call “the backfire effect”: in their study, correcting people actually increased their misperceptions.