Literally every single person, of all political stripes, will see this and agree with it.
Smart people can be stubborn and will sometimes dismiss facts too
Not just sometimes. Much of the time.
Not just much of the time. Often.
Not just often. Pretty much the majority of the time.
indoctrinated people don’t care if you think they’re smart. in fact, “smart” is a pejorative to them, unless the adjective is applied to dear great leader, who’s the only one allowed to claim the title, and the only one from whom information can be accepted as true
indoctrination indeed. GOP is a fucking cult
This is not entirely true, though. Beliefs and opinions are heavily influenced by a lot of factors, and even educated people are not free from such errors. Take the backfire effect (Nyhan and Reifler, 2010): people can become more entrenched in their views when confronted with contradictory evidence.
Other studies have found that when presented with data, individuals with more education can sometimes be more divided in their beliefs, particularly when the topic is politically charged. For instance, some educated individuals may use their knowledge to selectively interpret data in ways that support their pre-existing views, a phenomenon known as “motivated reasoning.” Confirmation bias is closely related. This has been observed in areas like climate change, where political and ideological factors heavily influence opinions. (See for example: https://doi.org/10.1073/pnas.1704882114 )
In other words, no matter how educated or smart you are, you can still fall into ignorance and stubbornness. The key is to train your ability to think critically—especially when it comes to your own beliefs and opinions. Doing so can help you become more aware of biases and avoid common pitfalls in cognitive decision-making.
Receiving an education and being wise are two very different things.
Where meme
Apparently I stumbled into Facebook somehow.
Oh my god, that’s exactly the vibe I have been getting from Lemmy this past month. Add some minions from my grandmother to the feed and you’ve got Facebook with more communists.
I think this makes it seem a lot more black and white than it really is. Defaulting to information that agrees with your worldview is a natural human bias. We all suffer from it, and it’s important to actively try to work against it.
Defaulting to information that agrees with your worldview is a natural human bias.
In general, perhaps, but in the face of conflicting facts?
Everyone has their own set of facts. That’s the basis for their worldview. It doesn’t mean those facts are pertinent to the question at hand, but upsetting their entire worldview is not something people allow easily.
And that’s human nature. We’re a social species. We belong to tribes and depend on our tribe for survival. If we could drop our worldview like a load of dirty laundry then we’d be walking away from our tribes and dying.
Ask anyone who has had profound political disagreements with their family. It’s enormously painful. While you can drop a position here or there in an election, it’s not as easy to drop your family.
Everyone has their own set of facts.
If you mean their own configuration of facts, I agree. If you mean “things they believe”, I disagree that those are “facts”.
Perhaps I was being too vague but the key to my point is this:
Doesn’t mean those facts are pertinent to the question at hand
What I’m talking about is facts about people’s situation in life. Their friends, their family, their community. It’s well-known that many people will believe the medical advice of a close family member over that of a doctor. Does this mean that they (through family connections) have access to some secret medical knowledge?
No.
What it means is that a person’s instincts to trust their family and close friends — members of their tribe — make it difficult for them to accept contradictory information from their doctor (a stranger). You can extend this issue to almost any domain of expertise (apart from those in which the person in question has had formal training). This is why conspiracies, myths, and other falsehoods can be so difficult to dispel from outside those communities: the people who believe these things are not going to take the word of strangers who try to contradict their friends and family.
And so what I mean about people having different facts is this: their relationships and communities are different. Their whole worldview depends on their ability to trust the people they’re closest to. So when it comes to the question of whether to believe a falsehood (myth/conspiracy/scandal) or to reject it and in so doing reject their own community (with catastrophic results for their life), it should not be a surprise that they choose to believe a falsehood.
And in that I agree. But I read the OP as saying something different.
As an example: Trump is a rapist. That’s a fact. How is that a fact? Well, his victim detailed the rape, produced evidence to corroborate it, and a judge and jury agreed, fining him $85 million for saying he didn’t rape the victim. Was he tried, convicted, and sentenced under a charge of rape? No. Statute of limitations and other reasons prohibited that. But the “fact” remains.
Now, the evidence of that fact is: the corporate news reporting of it AND the trial AND the transcripts which include witness testimony. Can all of those things exist for something that isn’t a fact? In extreme examples, yes, but it’s very rare. So as best as anyone can determine, this is a fact about a political figure.
A trump supporter will not believe it. Just like that. No reasoning, no plausible counter-argument, just - no. Because that is against their belief system. A straightforward rejection of a simple proven fact.
I’m saying I think that’s qualitatively different from a person altering their belief about the relatively unknowable - what is “god”, the purpose of life, how health is maintained - all of which have varying degrees of provable empirical fact but which are malleable to one’s family, society, culture, etc.
Reality: 2+2=4
Trump: 2+2=5
MAGAts: 2+2=5!
Reality: no, it really, really doesn’t.
MAGAts: I don’t subscribe to your facts! 2+2=5!
That’s what I think the OP is describing.
Right, but now we need to ask ourselves how a person could get to the point where they don’t believe the media reporting all of this and instead they choose to believe Trump.
It starts with their community and it ends with a total collapse in their trust in public institutions, including the media. Then, if they and all their friends and family have begun to believe that the media (what they might call “left wing media”) are engaged in a conspiracy to disenfranchise them and their community (by trying to disqualify their chosen candidate through alternative means), it becomes easier to see why they would reject the facts.
It’s really a serious problem for democracy in the U.S. (but also in other western countries), and it didn’t begin with Trump, nor will it end with him. It’s a sign of major fault lines running through society.
It’s a good question. However, I think it was answered about 30 or 40 years ago.
The answer is that media consumption and propaganda are often exactly the same thing, and we don’t limit, police, suspect, or explain media consumption at all. That’s usually considered to be a good thing, but I think we see in the age of TikTok that it’s gone way too far, and we need to teach basic media literacy starting in elementary school.
That’s something that none of Trump’s supporters have had. I think what’s at work in that situation (the right-wing blogosphere, etc.) is some bastardized and weaponized version of “media literacy” that is strictly focused on not believing standard authority, and only believing the “new” authority.
Which is itself a very old ploy.
I’d much rather confirm whether new information is accurate before adjusting my world view. Not all new information is equal.
This was my thought. Everyone challenges the source when they don’t like the info, and honestly there is a ton of bad info out there. When it comes to research, I like to know where the funding came from.
The problem is, this is wrong. Most people won’t change their views easily. We instinctively downgrade evidence that disagrees with us and upgrade that which reinforces our beliefs.
Ironically, “smart” people can be FAR worse at this than stupid people. Just ask anyone who’s tried to do IT work for a doctor. Smart people are able to build more elaborate mental constructs to explain away contradictory evidence.
This comes to a particular head in science. Scientific papers are written in a weird way. It’s always in the 3rd person, with as much personality taken out as possible. This helps when someone critiques it. Disagreements are with the paper, not the author. This is backed up by a LOT of training at university level. Even so, scientists are still prone to hanging onto outdated ideas far too long. These are people who are undoubtedly “smart” by any reasonable measure.
I get what you’re saying, but assuming you’re talking about medical doctors, they’re a bad example. I know three doctors well and they’re all dumber than a sack of hammers. Becoming a doctor doesn’t require much intelligence, it requires the ability to stay in school long enough (and being able to tolerate gross stuff from other people’s bodies).
What do you call someone who got all Ds in medical school? Doctor.
It’s actually part of my point.
Doctors are intelligent; you have to be to absorb the amount of information they are required to learn. However, it’s specialised intelligence. Being smart about medicine doesn’t make you smart about other things.
It’s like we all have a pool of base intelligence. We can then pour it into various moulds. The traditional intelligent professions are often just reliant on a large amount of specialised intelligence. This actually robs them of other forms.
It’s easy, when you can demonstrate high intelligence in a difficult field, to assume you are intelligent across the board. A stupid person can often know they are stupid and so can compensate. An “intelligent” person can be blindsided by their weaknesses.
It’s amazing what 2 smart people in a room can rationalize.
That’s not true!
No, people do not change their opinions based on new facts. It’s important not to think of it like that, because even people we would all consider rational and non-“indoctrinated” work like this. If you really want to change people’s opinions on things, especially things that are important, you need to know how our brains work to get there, and you shouldn’t think less of people for not changing their minds immediately. Studies have shown our rationality is not a means of making decisions but a means of explaining our decisions. I highly recommend this Vsauce video on the topic. It’s a great watch. https://youtu.be/_ArVh3Cj9rw
I first saw this video in early 2021 after spending a lot of time trying my best to show people they were wrong about COVID misinformation and election misinformation. It was a nice epilogue to that period of my life.
I highly recommend this Vsauce video on the topic.
Okay, but you should be warned, it won’t change my opinion.
show people they were wrong about COVID misinformation and election misinformation
I do think people forget how much contrary information is in the discourse when they try to make an argument and discover that their audience is unmoved. It’s like talking to a 12 year old about legal drug use after they’ve received all their information from 90s sitcoms and the D.A.R.E. program’s most insane police officers.
So much of our understanding of the world is a composite of our prior accumulated experiences and inputs. A single contrary data point will only provoke skepticism, as it collides with a bulwark of rebuttals. Shaping someone’s views takes time, and works best on people who haven’t spent years/decades inoculating themselves to the message. That’s why propaganda exists in the first place. A steady repetition of claims and data points in favor of a particular outlook will leave people resistant to opposing views.
That said…
There are some good notes in the video, but so much of it seems to want to rehash general logical fallacies without addressing the underlying nature of the claims. This gets us to an argument from fallacy, wherein you attempt to dismiss a claim entirely because of bad logic. “The sun rises in the east because God wills it!” is a fallacious claim, but I would not look for the sun to rise in the west as a result.
Also…
This video was created in partnership with Bill Gates and was inspired by his new book “How to Avoid a Climate Disaster.”
You can’t help but grapple with the fact that Microsoft is one of the biggest modern contributors to misinformation, thanks to their massive investment in AI. And, mumble mumble Gates was on the Epstein jet mumble mumble, which isn’t so much a refutation of the video as a note on the motivations of the authors.
Aside from the kid-sex stuff, Epstein is known for circulating some broadly ignorant and socially detrimental views on international finance, neoliberal economic reforms, and foreign policy. It can be argued that the Extended Mind Theory of Consciousness is pseudo-psychological drivel (particularly in how this video attempts to glamorize it, raise the stakes, and tie it into the fucking Fermi Paradox). This reads like the kind of overly dramatic pronouncements he and his friend Steven Pinker are best known for.
Which, again, isn’t even to say the core ideas are wrong. But the people pitching them… It’s like getting a lecture on the moral hazard of warmongering from Dick Cheney and Bill Kristol. Or a condemnation of sexual assault from Donald Trump.
Funny. I was in a group of people (was, in past tense) that made fun of me because I was the first one to change my mind if new evidence showed that I was wrong. They saw it as a weakness, as if my ideas were wrong because I was able to change them when I was proved wrong.
I guess this helps explain why I “was” and not “am” a part of that group anymore.
It’s not surprising though considering reasoning is more of a social defense mechanism than anything.
I always find that people who are unflinchingly rigid in their beliefs seem to be incredibly happy.
I believe that this stems from their inability to grasp the concept that the things that you don’t want to be true sometimes are, and the things you wish were, sometimes aren’t.
Ignorance truly is bliss.
Drugs apparently are a close second.
Nah, this sounds like bs to me
I did what you see there
Not a meme but ok
Something I’ve been able to do somehow that I’m very proud of is to train myself to hold space for, and to appreciate cognitive dissonance — I’ve found that trying to quickly clear away the discomfort of it leads to doubling down on beliefs or opinions, which isn’t productive. I used to do that a lot, because I had internalised an image of myself as someone whose views and beliefs are consistent and coherent (not like all those other stupid people). I cringe to think of past me, but I suppose that’s good in a way, because progress.
Most of my progress has come from remodelling of my world view, which is a messy and lengthy process. Sometimes I have to sit with the discomfort of cognitive dissonance for a while before I can understand it and resolve it.