Strategy (Obsidian Notes)

People's beliefs

Aesthetic vs function

Many functional things possess a desirable aesthetic because of their function. An unfortunate side effect is that many people will adopt the aesthetic of a particular thing rather than the thing itself, a distortion that lets one seek the additive benefits of that thing without incurring any of the intellectually rigorous costs associated with it.

Some examples of this might be:

  1. “Non-partisanship,” or putting oneself above pressure to believe everything a certain political group believes. Very rarely does the “non-partisan” individual have a way of independently generating unique, consistent and coherent thoughts, but rather they find a way to criticize unimportant things that appear to be part of their political in-group while still aligning 100% with every important, core belief. A great example of this is being highly critical of Donald Trump’s promiscuity or fiscal habits while being in lock-step with everything more integral to Donald Trump’s associated beliefs, such as his historically divisive rhetoric, his claims of mass voter fraud, or his claims that the entire political and corporate world are a “swamp” waiting for him to drain them.
  2. “Rationality,” or pretending to be someone possessing a high level of “reason” or critical analysis. A person claiming to be “rational” or “truth-seeking” almost never exhibits the qualities fundamental to rationality; rather, they tend to align almost entirely with certain other groups of people and simply wield this “rationality” against someone who belongs to the “wrong” side.
  3. “Skepticism” is never applied evenly to anything. People are almost never skeptical from a place of rational criticism, but rather because the skepticism is required to maintain their cognitive consonance.
  4. “Uncomfortable truths” are almost always incredibly comfortable. They wear the guise of discomfort in order to smuggle in some incredibly comfortable thing. Great examples of this might be telling people the “uncomfortable truth” that some huge system is against them, e.g. the FDA or WHO or IMF, but this is actually incredibly comfortable to most people because it helps them organize the world and provides an external entity on which to blame significant societal problems. Almost none of these “uncomfortable truths” require you to change your mind or your actions in a way that you wouldn’t already want to; instead they serve to motivate you to move down a path you already find highly desirable.

Constellation of beliefs

Rather than starting from some epistemically humble foundation and individually constructing beliefs, people inherit constellations of beliefs from the social groups they find themselves in, often due to social pressures that push them towards these belief systems. These constellations of beliefs share some similar, hollow chains of logic, but they are ultimately absorbed at the “applied” level for most people, often with little to no thought put into them before they’re accepted as a person’s own. These beliefs are always somewhat abstract as well, meaning no immediate corrective mechanism exists in the real world to counter them: they make claims that are either just outside our ability to perceive or that will come to pass at some point in the future (though they rarely, if ever, manifest).

Here is an example from 2024:

The “anti-establishment” constellation:

  • Supports Donald Trump
  • Believes Donald Trump is being unfairly prosecuted
  • Believes in the “elite” ruling class
  • Thinks the 2020 election was stolen
  • Doesn’t trust any of the mainstream media
  • Doesn’t trust the vaccine
  • Thinks COVID was largely overblown and all forms of lockdown were unnecessary
  • Supported Brexit
  • Doesn’t trust most US institutions, especially intelligence agencies or anything related to health
  • Believes Andrew Tate is being unfairly targeted
  • Opposes most “woke”-coded things, e.g. transgenderism, affirmative action, feminism, etc.

Evidence of the existence of these constellations is supported by the following:

  1. A high degree of congruency between the many people who are part of these constellations.
  2. A high likelihood that a person possessing a single belief in a constellation will also possess many of the others.
  3. Parts of the constellation appear to be highly contradictory to each other, suggesting they were absorbed rather than individually reasoned into.
  4. “Webbing” - when someone hears that you hold a belief that’s part of another constellation they’re aware of, they will instantly assume you hold a plethora of its other beliefs, since they are projecting their own adherence to a constellation onto you.
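
Point 2 above can be restated as a claim about conditional probability; here is a minimal formalization (my notation, not part of the original notes), where B_1, …, B_n are the beliefs making up a constellation:

  % Minimal LaTeX sketch; B_i, B_j are assumed labels, not from the notes.
  \[ P(B_j \mid B_i) \gg P(B_j) \qquad \text{for distinct beliefs } B_i, B_j \text{ in the same constellation} \]

That is, learning that someone holds one belief in a constellation raises the odds that they hold each of the others far above the base rate; “webbing” is a person intuitively (and over-confidently) running this inference on you.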

It becomes incredibly difficult to break anyone out of any given constellation because there is always some underlying transcendental entity that can explain every single part of why the constellation is true. Even evidence that seems to be in opposition to any particular belief is reinterpreted through the transcendental entity and becomes evidence for the particular belief.

Examples of this in relation to the earlier anti-establishment constellation:

  • "I don’t trust the mainstream media, but the New York Post was just an example of someone trying to do the right thing for once!"
  • "I don’t trust any part of our government, and Donald Trump isn’t technically a part of the government! That’s why they hate him so much, because even though he’s the president he’s not like them!"
  • "I don’t believe anything the intelligence agencies say, except for when they say things that I agree with, like when some say COVID probably came from a lab!"
  • "I never trust special investigators, like Mueller, but I do trust some like Durham, but I definitely don’t trust people like Comey when he refused to press charges against Hillary, but I definitely did trust him when he gave that scathing review of her email use!”
  • "Donald Trump is an amazing leader, it’s not his fault every person he nominated or worked with ended up backstabbing him! He’s such a good leader that they had to come together like never before to stop him!”
  • "The election was rigged, even though Trump’s own nominated lawyers, campaign managers and white house staff said otherwise, and it was only rigged in the states he lost, even if the means to rig it were done months ahead of the actual election and went completely unchallenged!”

Humans are not truth-seeking machines

Humans treat truth as instrumental, meaning it is a tool to be utilized to further someone's personal happiness. When the tool is no longer serving that purpose, it will be abandoned in favor of another tool. In certain domains, especially those relating to science, engineering and medicine, “truth” most likely needs to adhere pretty closely to reality for it to yield any sort of tangible benefit. It’s not very beneficial for one to believe that drinking mercury will cure their disease, nor that pouring gasoline onto a candle will make it burn brighter, so these beliefs are quickly and immediately dispelled in favor of something that more accurately approaches what is actually true.

It’s important to recognize this because humans who believe their senses naturally and intuitively lead them to “truthful” things rather than “pleasurable” things are likely to confuse the two whenever there is an incentive to believe something other than the truth. Without any immediate mechanisms to align “truth” with “utility”, people can wildly diverge from truthful things in fantastic ways that lead to catastrophic consequences down the road. Examples include vaccine skepticism, climate change denialism, or mistrust of our voting process. Since these beliefs inherited from our social groups lack any immediate corrective mechanisms, it’s easy for them to fester and manifest in long-term negative ways.

Normative confusion with applied positions

Rather than building beliefs from a logical foundation, people tend to inherit constellations of beliefs from their social circles, due to misaligned incentives that reward beliefs for the social validation they provide rather than for their factual accuracy. Since no work was ever done to arrive at these applied positions through logic, applied positions are quickly and easily confused with moral positions.

Here are some examples expressed as syllogisms, where the first proposition is a moral claim, the second proposition claims to remedy an injury to said moral claim, and the conclusion demands that we follow through to cure the moral deficiency listed in the first proposition.

Formula:

P1. Moral claim.
P2. Something that purports to support the moral claim.
C. A demand to follow P2 to satisfy P1.

Examples:

P1. Homelessness should not exist in the United States.
P2. Free public housing would solve homelessness.
C. Therefore, we should adopt free public housing.

P1. Civilians should not be targeted in war.
P2. A charge of genocide would force Israel to stop killing civilians.
C. Therefore, we should charge Israel with genocide.

In the prior examples, if you disagree with P2, you necessarily disagree with the conclusion, but your interlocutor will assume that you also disagree with P1 as it becomes intrinsically tied to the conclusion since P2 is considered unquestionable. P2 is most likely unquestionable as it is inherited as part of the constellation of beliefs, meaning the individual has done no work to actually investigate P2, nor do they have any incentive to do so.

This means that if you disagree that we should adopt public housing, it’s because you believe homelessness should exist in the United States. Or if you believe we shouldn’t charge Israel with genocide, you believe civilians should be targeted in war.
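
To make the structure explicit, here is a minimal propositional sketch (my notation, not part of the original notes): write G for the moral claim in P1, E for the effectiveness claim in P2, and A for the conclusion.

  % Minimal LaTeX sketch; G, E, A are assumed labels, not from the notes.
  \[ \text{The argument: } \quad G,\; E,\; (G \land E) \to A \;\;\vdash\;\; A \]
  \[ \text{Rejecting P2: } \quad \neg E \text{ removes the stated support for } A \text{, while } G \land \neg E \text{ remains satisfiable} \]
  \[ \text{The fallacy: } \quad \neg A \text{ yields only } \neg G \lor \neg E \text{, yet the interlocutor concludes } \neg G \]

Denying E blocks the argument for A while leaving G untouched; resolving the disjunction ¬G ∨ ¬E to ¬G, when ¬E alone explains the disagreement, is exactly the confusion described here.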

This form of irrationality is incredibly destructive due to how quickly someone will ascribe to you an entire constellation of beliefs, and likely all of the same moral reversals, based on the single example you’ve disagreed with.

Here’s an example constellation for someone who’s progressive:

  • Systemic racism is real, and is a huge driver of racial inequity
  • Healthcare should be free to all
  • Education should be free to all
  • Trans rights are unquestionable for all ages
  • Housing should not be commodified
  • Profit is immoral

Any one of these beliefs could be extracted like the syllogisms above, and subsequently reversed on you if you disagree with any part of their construction:

P1. Workers shouldn’t be exploited.
P2. Eliminating profit would eliminate most worker exploitation.
C. Profit should be eliminated.

P1. Everyone is entitled to free healthcare.
P2. Medicare-for-all would provide free healthcare for everyone.
C. We should have Medicare-for-all.

A denial of the second premise on either of these would lead to a denial of the conclusion, but most people would see it as a denial of the first premise.

Skeptics rarely are

When individuals become skeptical of a particular thing, it’s usually not because the thing is failing to live up to some internal rational rubric, but because it’s failing to satisfy some other personal desire or is causing some amount of cognitive dissonance.

Evidence of this can be found in multiple ways.

  1. Skeptics are rarely skeptical in an even way across multiple entities, e.g. a skeptic might be highly suspicious of something reported by NPR, but will implicitly trust anything reported by RT.
  2. Skeptics will apply their skepticism selectively across the same class of media entities, e.g. a skeptic will tell you that you can rarely if ever trust the mainstream media, but they will hold onto stories like the Hunter Biden Laptop reporting from the New York Post uncritically as gospel.
  3. Skeptics will apply their skepticism selectively across the same media entity, e.g. if a particular media entity is reporting something they disagree with, it’s further evidence that the entity is not to be trusted, but as soon as that entity is reporting something they do believe in, that media entity suddenly becomes a worthy source.
  4. Skeptics almost never move towards an area that requires further investigation or a critical re-examination of their own ideas or beliefs; instead they almost always utilize their skepticism to move to areas of greater comfort, e.g. if a report comes out showing an mRNA vaccine performing well, it’s easy to be skeptical of that report and hold onto the belief that the vaccines don’t work, because this reaffirms an anti-establishment world-view.

Social incentives dictate beliefs

People rarely apply a consistent, rational formulation to whatever their beliefs are. This is not because people are stupid or lack the ability to do so, but rather because there is little to no incentive for doing so. Truth is simply a means to enrich our lives. For the majority of political issues, it is easier to change the truth than it is to change one’s mind. There is great incentive in believing particular untrue things, especially when there are few practical consequences for having an incorrect understanding of an issue.

Examples of certain social pressures:

  • One might have family members belonging to a certain religion. “Leaving” that belief system might entail being completely disowned by one’s family, which would lead to significant loss of financial and social support.
  • One might have an audience that demands to hear a certain thing, and giving a take counter to what that audience believes might cause one to incur a significant financial or reputational loss.
  • One might belong to social groups in school or on social media that demand adherence to certain sociopolitical beliefs and betraying those beliefs (or not showing strong support for them) might result in a person being thrown out of said social group.
  • For many of the social and political beliefs we hold and value today, being “factually correct” offers little to no reward. Whether or not you know the factual details about a particular shooting, war, catastrophe or medicine is very rarely relevant since that “belief” is not factually tested on a day-to-day basis.

Let’s analyze a controversial event, the Rittenhouse trial concerning the 2020 shootings in Kenosha, Wisconsin, and think about the social incentives on both sides, assuming you are part of a social structure that is pro-BLM:

  • Rewards for having factually accurate beliefs:
    • Assuming you’re not someone whose job relies on getting the facts correct, like a prosecutor, lawyer, journalist or someone working in the system, there is no reward for being factually correct about anything related to Rittenhouse.
  • Punishment for having factually accurate beliefs:
    • If you are in any social circles that are staunchly pro-BLM, expressing certain factual accuracies about the events of that night might lead to you being ostracized from your groups, or webbed as a white supremacist.
  • Rewards for having factually inaccurate beliefs:
    • Depending on what you get wrong, people might celebrate you for having incorrect beliefs, especially if you’re over-stating the wrongdoings of Rittenhouse or under-stating the actions of the assailants. Common examples of wrong beliefs being rewarded: “Rittenhouse was spraying his magazine into the crowd!” or “The person chasing Rittenhouse only threw a plastic bag, how scary!”
  • Punishment for having factually inaccurate beliefs:
    • Assuming your inaccurate beliefs are cutting against Rittenhouse, you’re unlikely to face any real repercussions for having inaccurate beliefs relating to Rittenhouse; a gentle correction is likely the most you’d experience.
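
The four cases above can be compressed into a rough payoff matrix (my summary of the notes, not part of the original):

                            Reward                    Punishment
  Factually accurate        ~none                     ostracism, webbed as a white supremacist
  Factually inaccurate      social celebration        at most a gentle correction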

Given the incentive structures for and against certain beliefs, there’s little reason to expect someone to strive for factual accuracy and every reason to believe people would try to adhere to their social groups as much as possible.

Countering irrational beliefs

Offer someone in the audience of my interlocutor $100 if they are capable of rationally explaining how they justify a belief in some inherently contradictory position. Ensure that those people are organically a part of the community, and have been for years.

Debate pervertry

When you accuse the other side of saying a thing and, instead of allowing them to explain or clarify, immediately launch into an attack on that particular thing.

Show, don't tell.

When you constantly refer to other books, speakers, videos, etc... ("sources of authority") that you claim to be familiar with, without contextualizing or demonstrating an understanding of any of that underlying material. You are substituting an appeal to authority for an actual argument.

Any man who must say "I am the king" is no true king.

When you intentionally say the name of your interlocutor incorrectly.

See also: Norman Finkelstein

Fallacy of the Single Cause

The clown mirror

When your opponent never seems able to summarize your position, ever, and you constantly have to criticize or refuse to accept ANY characterization of your position they offer.

BITING LEADING QUESTIONS = SUPER GOOD FAITH

The lazy gardener

Which could also be referred to as the "Let's not get into the weeds" strategy.

Oftentimes, when the opposition lacks a thorough understanding of what's being said, they will attempt to obfuscate away from crucial details by claiming that they "don't want to get into the weeds" or "don't want to get into technicalities", even though these particular technicalities might be essential to justifying or attacking a particular argument.

Warning signs that you may be talking to a "Lazy Gardener":

  • "let's not get into the technicalities"
  • "let's not get bogged down"
  • "too much legalese"
  • "this is just semantics"

The deaf preacher

When you refuse to engage with the argument and instead make big sweeping moral/virtue-signaling statements while avoiding any factual response to what was previously said.

Debate edging

When you constantly stack descriptive claims one over another that are clearly leading into a certain prescription that you never actually verbalize, causing other people to attack you on a prescriptive claim you've never made and allowing you to refute their arguments without addressing the obvious implications of what you're saying.

Occam's mallet

When someone suggests that simply because a party benefited from something (or had something to gain from its failure), there must have been some sort of cohesive plot or scheme to bring about that particular thing, often involving highly subversive and unethical means.

Moral dodgeball

Accusing someone of holding a different core value simply because you disagree with an applied position that they have.

Robinhood complex

Always siding with the less powerful entity in any conflict, simply due to the amount of power both sides are capable of exercising.

The braveheart

When someone poses a question about how a person should respond in a situation where it's obvious that the person would need to act in a certain way to protect their interests, but the more privileged debater responds with "Personally, I wouldn't do this..." instead of acknowledging the affected party's need to respond and protect their interests in a particular way.

You're being so weird/obsessed!

When you accuse someone of being weird or obsessed for doing exactly what you do yourself, e.g. your community makes clips/compilations of the other person, but when the other community creates something similar in response, you call them weird.

Death by a thousand anecdotes

When someone is incapable of pushing back factually against a heavily data-driven argument and instead relies upon personal (or popular) anecdotes, or unrelated data to make their point.

Tragedy of the commons sense

Basically any time someone invokes common sense because they're unable to explain or justify their position in any other way.

"I don't even care"

"The Webster Warrior" or "The Oxford Offensive"

When you are having a debate around a topic that requires a certain level of complex contextual understanding, but, when confronted and unable to provide satisfactory justifications for your arguments, you appeal to a dictionary definition.

Analogy Allergy

When a person refuses to engage with a hypothetical, not because they've demonstrated the hypothetical is inapplicable to the current disagreement, but because the hypothetical is "wacky" or "crazy."

  1. Andrew Wilson vs Dave Smith: Is Libertarianism better than Christian Populism? (https://www.youtube.com/watch?v=YqIaiQ-aK_s)

Interview strategy

Students and "children" (under 25)

The goal is to interview people as easily as possible in order to gather as wide a variety of opinions and conversations as possible. In order to accomplish this, that means I likely need to:

  • Be as non-confrontational as possible
  • Try to avoid “debating” or “critically drilling” to the point of discomfort
  • Review the conversations I do record in the most positive and constructive manner possible (this includes not allowing my stream/community to dogpile interviewees)

Exceptions for any of these rules can probably be made for people that are:

  • Exceptionally aggressive (entire negative encounter NEEDS to be caught on camera in order to justify any negative or critical comments about the person)
  • People who opt for a more aggressive/critical conversation (doesn’t necessarily mean we are attacking them hardcore, but we can be a bit more critical of their talking points)
  • People who are public figures or “leaders” at their given protests

Perception

Things within my control:

  • The way I dress
  • The way I speak
    • The types of examples or analogies
    • How condescending or understanding I act
  • The sources I cite
    • The authorities I appeal to
  • Content I choose to publish

Things outside of my control:

  • Versions of me put out by other people
  • Hasan, Vaush, etc. saying things about me, or friendly sources like Erudite, Pisco, etc.
  • "Curated" content of me put out by other people
  • X clips (especially groypers), TikTok stitches, small YT videos, etc. can be good or bad

Rhetorical strategies

Epistemic/aesthetic flaws

  • "Just asking questions"
    • Predict the contour of the conversation
    • Ask questions, and force them to take a hard position.
  • Diction: Big and small terms "BIG anything" "Deep state"
    • Ask who the deep state is, force them to relate them to the topic at hand.
  • Characterization - different tragic hero tropes, for instance. The creator of the conspiracy theory will often portray themselves in this way.
    • Ask more and more questions about the person's character.
  • "Do it yourself" - stuff conveyed in newsletters, pamphlets, "man on the street" kind of vibe.
    • Ask why they made the choices they did when it comes to presentation.
  • Lots of "shape-talk", charts and stuff, mapping
    • ????

Nathan mentioned in the debate (paraphrasing): "if you are separating people who conveniently have a different race, and you give them less rights while they have a different race, that is apartheid." You replied by saying it was about citizenship, not race; ergo, not apartheid.

A good analogy here is something Scott Alexander has written about: maybe the negative impacts on black people occur not because the criminal justice system is racist, but because it is biased against poor people. The problem with saying "it is racist because it disproportionately impacts black people" is that, while ostensibly close to the truth, it is very misleading. All of a sudden you don't care as much about improving economic conditions and opportunities, or dealing with jury bias, but are distracted by things like implicit bias/racism training or caring about what the media will say in reaction to an arrest, etc.

I consider this a subtle use of the noncentral fallacy, which I know you hate.

  1. When taking notes during a debate and responding to everything the other person says, the other person rarely responds to all of my points. Instead of wiping my entire board, I should remind the person (and the audience) that they failed to respond to the points I've made.
  2. Need to call out people more who live in stories or anecdotes or unrelated examples/data.
  3. Restate when a person doesn't answer a question and move on.
    • Restate what the other person said and ask them if that was an answer to your question.