[HN Gopher] Mental Liquidity
___________________________________________________________________
 
Mental Liquidity
 
Author : bkohlmann
Score  : 159 points
Date   : 2023-06-11 12:31 UTC (10 hours ago)
 
web link (collabfund.com)
w3m dump (collabfund.com)
 
| hammock wrote:
| Of course we must reference Berlin's fable of the Fox and the
| Hedgehog here.
| 
| A great essay in this area is Venk's Cactus and the Weasel.
| https://www.ribbonfarm.com/2014/02/20/the-cactus-and-the-wea...
 
| yowlingcat wrote:
| In my experience, this attribute is an absolutely critical part
| of successfully building culture at an early-stage startup, and
| you have to be ruthless about culling those who are not willing
| to give it a try, never mind master it.
 
| conradev wrote:
| > Most fields have lots of rules, theories, ideas, and hunches.
| But laws - things that are unimpeachable and cannot ever change -
| are extremely rare.
| 
| This sounds like a rehash of Popperian epistemology. We should
| look forward to disproving existing theories (finding new
| problems), because it leads to new, better theories.
 
  | [deleted]
 
| lcuff wrote:
| This article matches my own life experience: rather than 'what
| have you changed your mind about in the past decade', I use 'in
| your whole life'. Speaking personally, there are only two big
| things I've changed my mind about. I'm working on a third... I
| wish the article had included something in the vein expressed by
| Charlie Munger, which is a 'how-to' for intellectual integrity.
| 
| "I never allow myself to have an opinion on anything that I don't
| know the other side's argument better than they do."
 
| lostdog wrote:
| What kind of deliberate practice can help improve your mental
| liquidity?
 
| krm01 wrote:
| One of the biggest beliefs I keep struggling with is the need to
| be perfect. I've been jamming away for many, many weekends on a
| side project that was essentially done. I just kept adding tiny
| tweaks left and right, until I literally just now launched it
| (https://amee.la).
| 
| Nothing groundbreaking, and in the end nothing that needed so
| much perfectionism around it.
| 
| The belief that something needs to be perfect is one of the
| strongest I see among founders here on HN and elsewhere. It's
| almost always bad. I have zero examples where it ended up being
| good. Yet, even though the facts are clear, it's extremely hard
| to overcome.
 
  | 2h wrote:
  | FYI the whole "let's type out some text" style is really, REALLY
  | annoying. Please just show the plain, non-animated text on the
  | screen. Use whatever fonts or colors you want, BUT DON'T make
  | the text type itself out or jump around the screen.
 
  | nicbou wrote:
  | There's no kill like overkill. I've been overdoing a project
  | for the last 5 years and I thoroughly enjoy it. The site is
  | live and pays my bills so why not?
  | 
  | That being said, this comment feels more like self-promotion
  | than conversation. Don't do that.
 
  | david_allison wrote:
  | Unsolicited feedback:
  | 
  | * Your input box doesn't look like a text box
  | 
  | * The 'enter' key doesn't work in the text box
  | 
  | * 'Refresh' neither looks like a refresh icon, nor has a label
  | 
  | * The fade on the right of the gallery implies you can scroll,
  | but this isn't possible
  | 
  | * The generated logo + icon pair wasn't immediately noticeable
  | (the first image is the icon without text, and the first icon
  | isn't guaranteed to be noticeable); consider generating an image
  | with text + logo on a transparent background and putting it
  | above the 4 sample images.
 
    | cassepipe wrote:
    | Haha, that's a bit of a cruel response to someone who just
    | wrote that he had a "perfection" problem. You've probably
    | caused OP to waste the entire week now. Web design is a total
    | time sink.
    | 
    | On desktop, you can actually scroll right now.
 
      | motoxpro wrote:
      | Haha totally. I had to keep in mind that my perfect is
      | someone else's average.
 
      | Gaessaki wrote:
      | This is good feedback though, as I had the same issues. It
      | shows the value of launching and iterating quickly with
      | user feedback, rather than building in the dark under the
      | guise of perfection.
 
    | interlinked wrote:
    | https://www.youtube.com/watch?v=opbF9Nz_Emg
 
  | detourdog wrote:
  | If you actually shipped, the extra weeks were probably
  | worthwhile. My side project effort can be measured in decades
  | with little real progress yet. I really think this year might
  | see some movement.
 
  | xwowsersx wrote:
  | Yeah, we are all susceptible to this. The tweaks were not worth
  | the time because they didn't move the needle on the core
  | offering. At the end of the day, this succeeds or fails based
  | on how good the logos are. In my few minutes of trying this
  | out, the generated logos were random, seemingly unrelated to
  | the names, and just generally very unoriginal and low quality.
  | I don't want to sound discouraging because this is a cool
  | project, but just to say that spending time perfecting pixels
  | and whatever else that doesn't have to do with the underlying
  | functionality is probably not time well spent at this point.
 
  | usefulcat wrote:
  | When I do this, rather than thinking of it as some kind of
  | mistake or failing, I think of it as an experiment or learning
  | experience.
 
  | ben_vueJS wrote:
  | This has been a mental barrier for me as well. I'm not sure if
  | it's in the realm of belief or rather fear of failure.
  | Personally, I'm more inclined to say it's the latter.
 
    | haswell wrote:
    | I'd argue that the fear of failure still boils down to
    | underlying beliefs about:
    | 
    | - What it actually means to fail
    | 
    | - That failure is inherently bad
    | 
    | - What will happen next after failure occurs
    | 
    | - What it says about me when I fail
    | 
    | - What others will think about me when I fail
    | 
    | - That I can't recover from failure
    | 
    | etc.
    | 
    | If you grow up hearing that failure is bad/wrong/implies
    | something about you as a person, it might never occur to you
    | that another framing is that life is a series of experiments,
    | and failure can be one of the best ways to zero in on success
    | (in some cases, this may be the _only_ possible way).
    | 
    | As far as I can tell, it's beliefs all the way down, and
    | adjusting certain beliefs can fundamentally transform
    | experience relative to all downstream implications of that
    | belief.
 
      | nathants wrote:
      | the best attitudes are an acquired taste. losing is fun!
 
  | mattgreenrocks wrote:
  | I'm partially convinced that overexposure to the comment section
  | can encourage perfectionism in those who are already disposed to
  | it.
  | 
  | And the comment section is rarely a representative sample of
  | your target audience.
 
  | deepzn wrote:
  | Reminds me of reading what Reid Hoffman said...
  | https://twitter.com/reidhoffman/status/847142924240379904?la...
 
  | moneywoes wrote:
  | Is this project just a funnel for your graphic design SaaS?
  | Reminds me of that Twitter user who popularized that model.
  | 
  | If so, does it matter if it's perfect when your goal is just to
  | boost the top of the funnel for the agency?
 
  | ianbutler wrote:
  | I like this. Quick suggestion: I'd add the ability to take one
  | of the generated logos and refine from it.
 
| wwweston wrote:
| > A question I love to ask people is, "What have you changed your
| mind about in the last decade?" I use "decade" because it pushes
| you into thinking about big things, not who you think will win
| the Super Bowl.
| 
| This is a great question. And "decade" is a good time frame not
| only because of size but because it's long enough that there's a
| better chance people will have good answers.
| 
| The Dee Hock quotes ("A belief is not dangerous until it turns
| absolute" and "We are built with an almost infinite capacity to
| believe things because the beliefs are advantageous for us to
| hold, rather than because they are even remotely related to the
| truth") are great too.
 
| turnsout wrote:
| Mental Liquidity is another way of thinking about "Psychological
| Flexibility," which is the subject of a huge amount of clinical
| research. There's an entire therapeutic framework called
| Acceptance and Commitment Therapy (ACT) which came out of this
| research.
| 
| Check out this article [0] for a description of ACT from a
| founder's perspective.
| 
| [0] https://every.to/no-small-plans/how-to-do-hard-things
 
| SnowHill9902 wrote:
| Stay humble.
 
| jrflowers wrote:
| > _Be careful what beliefs you let become part of your identity._
| 
| "I have a tight enough knowledge and grasp of my beliefs to
| intentionally control my sense of identity" is a fascinating
| belief to turn into an identity.
 
  | neerajsi wrote:
  | This might be relevant:
  | https://en.m.wikipedia.org/wiki/Self-authorship
  | 
  | According to the Kegan theory, it's possible. I'd be fascinated
  | to see it if anyone knows of a study that demonstrates self
  | authorship in a population of real people.
 
    | jrflowers wrote:
    | I've yet to hear anything akin to self authorship from
    | someone that didn't have a book/seminar/consultancy to sell.
    | 
    | It's a somewhat amusing thought that there is this human
    | phenomenon wherein we can transcend nature, nurture, the id,
    | the ego, the superego, biology and chemistry -- and
    | overwhelmingly those that achieve this enlightened state
    | coincidentally tend to end up as self-help bloggers and
    | motivational speakers.
 
| skilled wrote:
| I think this article is a little too overzealous in trying to
| simplify a topic like beliefs and ideas.
| 
| A lot of it also sounds like common sense to me; the people
| capable of grasping this:
| 
| > Be careful what beliefs you let become part of your identity.
| 
| are quite capable of adjusting themselves.
| 
| Everything else falls into either Ego, or people being
| self-(un)aware, and for the latter - you can only change "their"
| belief system if they themselves are willing to change.
 
| ChrisMarshallNY wrote:
| I have had to have an open mind.
| 
| Long story. Lots of tears. Get your hanky.
| 
| It's served me well, in my technical work.
| 
| I now do a lot of stuff that I used to scoff at.
 
| jonasenordin wrote:
| Brings to mind Robert Pirsig's 'value rigidity' concept: 'an
| inability to revalue what one sees because of commitment to
| previous values.' I don't remember if there was a term for the
| opposite, but 'flexibility' seems to be right.
 
| Borrible wrote:
| Take a die with six to twenty sides and assign a belief
| system/worldview to each number. Roll the die twice: once for the
| belief system/worldview, once for the number of months you live
| by it. Of course, you can vary the parameters according to your
| taste and courage. But it is important to persevere, so you had
| better start small. I call it the Rhinehartian chaotic paradigm
| shift. Dice Man goes chaos magic.
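| 
| A minimal sketch of the procedure in Python (the worldview labels
| and die size are made-up placeholders, not anything prescribed
| above):
| 
|     import random
| 
|     # Hypothetical faces of the "worldview" die.
|     worldviews = ["stoicism", "utilitarianism", "absurdism",
|                   "taoism", "pragmatism", "nihilism"]
| 
|     sides = len(worldviews)            # a six-sided die here
|     view = random.choice(worldviews)   # first roll: worldview
|     months = random.randint(1, sides)  # second roll: duration
| 
|     print(f"Live by {view} for the next {months} months.")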
 
| moneywoes wrote:
| AI taking away jobs is one. Previously I thought more jobs would
| be created, but now my beliefs have fundamentally shifted.
 
| wayeq wrote:
| > Albert Einstein hated the idea of quantum physics.
| 
| Einstein came up with most of what physicists now recognize as
| the essential features of quantum physics. He was not
| anti-quantum; he just believed randomness could not be a
| fundamental feature of nature.
 
  | bsder wrote:
  | Einstein also had a bunch of real, substantial objections.
  | 
  | One of the big ones had to do with whether the "fields"
  | formulation was valid and primary. One of the issues is that if
  | you follow the field formulations that Einstein believed in out
  | to their conclusion, you get things like "atomic orbitals never
  | decay".
  | 
  | Which, of course, is obviously wrong. And an example of one of
  | the reasons why Bohr is considered to have won his debates with
  | Einstein.
  | 
  |  _Except_
  | 
  | Einstein was right! We now know that when you isolate an atom,
  | its atomic orbital decay gets slower and slower the more you
  | isolate it.
  | 
  | The problem at the time was that all of the experiments that
  | could be run were statistical aggregations and obscured the
  | nature of single state quantum systems.
 
  | lo_zamoyski wrote:
  | N.b. Rob Koons's book[0] may be of interest to some of the more
  | philosophically inclined. He argues that the proper
  | interpretation of QM is in light of hylomorphic dualism.
  | 
  | [0] https://a.co/d/6eq227u
 
| javajosh wrote:
| The prerequisite for "mental liquidity" is articulated by
| Aristotle: "It is the mark of an educated mind to be able to
| entertain a thought without accepting it." If you entertain the
| thought, this gives you the chance to try out a new belief
| network. If you find your belief network would be strengthened by
| its inclusion, then you adopt it. Otherwise, you reject it. In
| this way, one's interconnected set of beliefs grows monotonically
| stronger. And this is right and good.
| 
| EDIT: got downvoted! I would love love love to know why! Not
| offended, just curious.
 
  | rgrieselhuber wrote:
  | [flagged]
 
    | dang wrote:
    | Your experiment has a confounding variable--it broke the
    | penultimate guideline.
    | 
    | https://news.ycombinator.com/newsguidelines.html
 
    | javajosh wrote:
    | That seems a bit trite. I was thinking it was a misclick, but
    | hopefully whoever did it will chime in, and they will NOT
    | themselves get down-voted, whatever their reason.
 
  | MrPatan wrote:
  | The mind that doesn't want to risk entertaining "wrong" truths
  | can't stand being reminded of the fact.
 
  | Etrnl_President wrote:
  | "One can not learn what one thinks one already knows"
  | --Epictetus
 
  | kubanczyk wrote:
  | You cannot say _belief_. Say _hypothesis_ while leaving the rest
  | of your argument intact. If you value that kind of score in your
  | life.
 
    | javajosh wrote:
    | What is a belief if not the highest-ranked hypothesis of all
    | possible options? Obviously beliefs are more complex than
    | that, since we have a default set installed in us as
    | children, and only a subset of humans are taught the rational
    | methods of improving those beliefs over time. I consider
    | myself a member of that subset.
    | 
    | (Quoting Aristotle always puts me in the mood to rank
    | things.)
 
      | theredfury wrote:
      | I do believe a hypothesis to be different than a belief. A
      | belief performs a different function than a hypothesis.
      | 
      | A hypothesis can be defined as a "proposition made as a
      | basis for reasoning, without any assumption of its truth"
      | (Oxford Languages definition). Typically a function you
      | perform to unearth a truth.
      | 
      | A belief on the other hand holds some position on the
      | spectrum of truth. To believe is to make an assertion about
      | truth. A hypothesis is somewhat of a precursor to that.
      | 
      | But hey, regardless of our stance on the definitions of
      | these words, I heavily jive with the idea that we should
      | improve our beliefs over time and I have mad respect for
      | Aristotle.
 
        | javajosh wrote:
        | I don't think we disagree. A hypothesis is upgraded to
        | "belief" and therefore to the "spectrum of truth" only
        | because it's the best you know of, not because it's the
        | only one. It's a matter of degree, not kind. And a
        | belief's position as the best one is always precarious;
        | it can be unseated at any time by a better hypothesis.
        | 
        | Axioms are different, but over time I've found that even
        | those weaken and become "merely" strong beliefs (or, more
        | usually, only True within the context you're working in,
        | e.g. mathematics). Even "I think therefore I am" is not
        | axiomatic, I have come to believe. In fact I doubt it's
        | important to identify some sort of root cause, which is
        | rationalist heresy. Oh well.
 
  | saiya-jin wrote:
  | > And this is right and good.
  | 
  | I disagree (without down-voting). This is basically a 1-man echo
  | chamber: you take what you like (it doesn't matter how many
  | eloquent words you use to describe this, the result is the
  | same), and reject what would challenge your beliefs and make
  | them weaker. That's the opposite of the critical thinking so
  | needed in the real world, and a prime source of why the current
  | world, particularly the West, is so torn to pieces about shit
  | like Russia, Trump, guns, migrants and so on.
  | 
  | Stuff in life is complex, always, almost at a fractal level. You
  | keep learning, if you actually want to, about new viewpoints
  | that will challenge your current ones, every effin' day. Maybe
  | in the end the conclusion is don't trust anybody, people are
  | generally a-holes etc. That's still fine as long as it
  | represents truth.
 
    | javajosh wrote:
    | It seems like you misread what I was saying. I am not
    | advocating a "1-man echo chamber" - that would be a person
    | who never changes their beliefs. When I say "weaker" and
    | "stronger" I am referring to the whole of the belief network,
    | not individual beliefs. This means, generally, that every
    | change reduces inconsistency and increases cohesion _of the
    | entire network_. The ignorant people in the world pay no
    | attention to consistency, only to feeling, which makes their
    | network intrinsically weak, and they become emotional and
    | ultimately resort to violence rather than resolve to improve
    | their beliefs. (The internet makes this kind of interaction
    | more common, even encouraged, since it drives "engagement" -
    | one of the great tragedies of our time.)
    | 
    | Stuff in life is complex, people are assholes, but even
    | assholes have good ideas sometimes. I recommend listening to
    | everyone who speaks for themselves in good faith. Anyone can
    | cook!
 
      | throwaway14356 wrote:
      | >This means, generally, that every change reduces
      | inconsistency and increases cohesion of the entire network
      | 
      | This is analogous to growing the tree; the page talks about
      | cutting it down.
      | 
      | One could give many examples but the good ones are unlikely
      | to resonate with others.
      | 
      | To give a poor one. There was a time when I understood
      | human decision making as a hierarchy of people in
      | increasingly greater positions of power with access to
      | better information and to people with greater skill. Then
      | one day it struck me how they too are just going through the
      | motions with their freedom for creativity limited to a
      | single potentially career ending move. The machine happily
      | grinds on without anyone behind the wheel.
 
      | kbenson wrote:
      | I think you're making assumptions about people and their
      | capability to judge consistency over large chunks of
      | information, when that information is at least internally
      | consistent and common in their experience.
      | 
      | If I believe the Clintons are pedophiles and murderers and
      | are part of a ring of like minded people, and I'm inundated
      | with information from people and organizations which
      | support this (or at least carefully don't refute it), then
      | when I'm presented with information about a pizza parlor
      | that is supposedly holding children in the basement, is
      | that consistent with my beliefs?
      | 
      | I think what you're presenting is just what everyone
      | already does. Instead of assessing things based on how well
      | they fit our beliefs, we should assess them based on a
      | consistent objective standard, and then alter our beliefs
      | if it meets that standard but conflicts with our beliefs.
      | 
      | This may in fact be what you believe, because you believe in
      | facts and the importance of the truth. The problem is that
      | you get wildly different results when someone that values
      | different things applies the same system.
 
    | testacct22 wrote:
    | > reject what would challenge your beliefs and would make
    | them weaker.
    | 
    | Most unresolved disagreements I know of are because the
    | groups disagree on some unprovable underlying assumption.
    | Switching positions on it doesn't make the beliefs stronger
    | or weaker
    | 
    | Being able to believe something and stick to it, regardless
    | of challenges from competing interests or forms of coercion:
    | that's more valuable in practice than being more
    | reconciliatory
 
      | TeMPOraL wrote:
      | > _Most unresolved disagreements I know of are because the
      | groups disagree on some unprovable underlying assumption.
      | Switching positions on it doesn't make the beliefs
      | stronger or weaker, just different_
      | 
      | In my experience, that assumption isn't in principle
      | "unprovable" - the parties to the disagreement usually
      | don't realize they're making such an assumption in the first
      | place! Switching positions can make the existence of that
      | assumption apparent to all, and if people involved are
      | intellectually honest and discussing in good faith, it's
      | pretty much impossible for their disagreement to remain as
      | strong as it was before.
      | 
      | > _Personally, I prefer having convictions and sticking to
      | them._
      | 
      | Good point about competing interests and "reducing to
      | something manageable". I prefer "strong opinions weakly
      | held", but in practice, I embrace the natural _inertia_ of
      | beliefs. I.e. I don't consider my already believing
      | something to be strong evidence the belief is true (i.e.
      | "having convictions") - but the stronger a belief is, or
      | more high-impact changing it would be (e.g. suddenly
      | feeling a moral compulsion to upend my entire life), the
      | more evidence _and time_ I need to change my mind.
      | 
      | This may also be a dumber and less admirable strategy, but
      | it's effectively a low-pass filter on evidence: it saves me
      | from changing my mind every other day, and suffering the
      | costs (including cognitive dissonance if I just plain
      | override my beliefs for the sake of quality of life).
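      | 
      | A minimal sketch of that low-pass-filter framing in Python
      | (the alpha value and the evidence stream are invented for
      | illustration):
      | 
      |     # Confidence in a belief, in [0, 1], nudged by each new
      |     # piece of evidence rather than overwritten by it.
      |     def update(confidence, evidence, alpha=0.1):
      |         # Small alpha = more inertia, slower to change.
      |         return (1 - alpha) * confidence + alpha * evidence
      | 
      |     confidence = 0.9                  # strongly held belief
      |     for evidence in [0.0, 0.0, 0.0]:  # contrary evidence
      |         confidence = update(confidence, evidence)
      |     print(confidence)       # ~0.66: moved, but not flipped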
 
      | throwaway14356 wrote:
      | Binary people are funny. Everything is always 1 or 0,
      | nothing is ever undefined, and the idea of having different
      | levels of certainty never occurs to them.
      | 
      | It is a rather offensive way to portray the world. All the
      | questions, all the puzzles, all the mystery, everything has
      | been answered and further investigation frowned upon. They
      | would have to _again_ defend their chosen "truth", they
      | would have to question everything!
 
  | byteware wrote:
  | I am curious how one would measure if their "belief network
  | would be strengthened"
 
  | ABCLAW wrote:
  | There's a common issue in philosophy and epistemology over how
  | we come to know things. We wanted to know what 'knowledge' was,
  | and settled upon the concept of a 'justified true belief' for a
  | fairly long period of time.
  | 
  | However, one day, a philosopher found a situation in which a
  | justified true belief did not seem to count as knowledge. This
  | is the Gettier problem.
  | 
  | What you describe is something akin to a network of Bayesian
  | conditionals attached to certain propositions, which upon
  | confrontation with new information update their relative
  | weights. We know with certainty that this process has
  | significant benefits in general (it's certainly better than
  | most systems that don't internalize new information), but it
  | can and does create false reasoning.
  | 
  | In short, it's good but not sufficient to create knowledge. The
  | problem of individuals creating ideological filter bubbles
  | around themselves is closely related to the idea that their
  | evidentiary priors become more and more rigid as they note
  | confirmatory evidence that justifies their views over time. The
  | issue isn't that they stop taking in new information, but that
  | their priors and the new information are interpreted through
  | that belief network.
  | 
  | Thankfully, as a super-organism, we have a great solution for
  | that mental ossification. We die. New people who have less
  | evidentiary accumulation can address the issue with new priors
  | and often that's all that's needed for huge breakthroughs.
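  | 
  | A minimal sketch of that kind of weight update, and of how
  | repeated confirmatory evidence hardens a prior, in Python (the
  | prior and likelihoods are invented numbers, just to show the
  | mechanics):
  | 
  |     # prior is the weight P(H) on a proposition; p_e_given_h and
  |     # p_e_given_not_h say how expected the new evidence is under
  |     # each view.
  |     def bayes_update(prior, p_e_given_h, p_e_given_not_h):
  |         p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
  |         return p_e_given_h * prior / p_e   # posterior P(H|E)
  | 
  |     belief = 0.5
  |     for _ in range(5):                     # five confirmations
  |         belief = bayes_update(belief, 0.8, 0.4)
  |     print(round(belief, 3))                # ~0.97: nearly rigid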
 
    | neerajsi wrote:
    | Thanks for your point about death.
    | 
    | Death forces our species-wide belief set to go through the
    | constrained channel of education and communication, the same
    | way that our bodily attributes go through the constrained
    | channel of our germ-line genes.
    | 
    | This process lossily compresses the signals, which allows for
    | drift or attenuation when the next generation reconstructs
    | the beliefs and associated behaviors. Transmission also
    | applies stress that acts as a filter to weed out beliefs that
    | are no longer adaptive.
 
    | lo_zamoyski wrote:
    | The Gettier problem is overrated.
    | 
    | The question is "what _is_ knowledge? ", not "do we know
    | _that_ we know _p_? ". And I see no issue with the definition
    | of knowledge as justified, _true_ belief. Now, if I believe
    | _p_ , and you ask me whether I know _p_ , I may say yes. But
    | whether I actually know _p_ will depend on whether my
    | justification is valid (that it really is a justification and
    | a sufficient one) and whether it is _true_ , which has
    | nothing to do with whether anyone _knows_ whether the
    | justification is valid and the belief is true. It 's a
    | separate question, and conflating the two questions leads to
    | an infinite regress of skepticism. So the definition of
    | knowledge qua knowledge still stands.
    | 
    | I would also suggest you try to apply your general approach
    | to the very theory you are proposing. I see an opportunity
    | for retorsion arguments.
 
| Invictus0 wrote:
| Mental flexibility would be a better term but of course finance
| people rarely perceive much outside their own bubble.
 
| trentnix wrote:
| I've heard it also said "strong opinions, weakly held". Unlike
| "mental liquidity", it doesn't require explanation.
 
  | zug_zug wrote:
  | I disagree. I just heard this phrase from you for the first
  | time just now, and I don't think it's self-explanatory.
  | 
  | It's unclear to me in what respect the opinions are "strong" if
  | not one's conviction in them. To my mind a strong opinion is an
  | opinion one is confident in.
  | 
  | Also it's unclear to me if/how/why this is better than "fewer
  | opinions". Like, is it better to have a "strong opinion weakly
  | held" on topic X versus "my opinion is that pending scientific
  | research will answer this"?
  | 
  | A nitpick -- I actually have a pretty big distaste for maxims
  | that have some cutesy rhyming/wordplay to them (in this case
  | it's X y, !X z, X = strong).
 
    | dasil003 wrote:
    | I agree it's not self-explanatory. All such pithy statements
    | are only insightful based on hard-won experience behind them
    | -- the map is not the territory, after all ;)
    | 
    | As far as "strong opinions, weakly held", this is one of my
    | favorites at work in a large scale product engineering
    | environment. It goes beyond "mental liquidity" as described
    | in the OA (which is really just about the "weakly held"
    | part). The "strong opinions" part is that often times groups
    | will succumb to analysis paralysis or unwillingness to make a
    | decision due to group dynamics. Having a strong opinion
    | (ideally backed by knowledge and expertise) is a way to push
    | through and bring clarity. The risk is there is a personality
    | type prone to blustering overconfidence that will push a
    | group in a certain direction without reasonable
    | justification. Ideally what you want is a critical mass of
    | smart, decisive, but open-minded people who are quick to
    | assimilate new evidence into their viewpoint.
 
    | dgs_sgd wrote:
    | > To my mind a strong opinion is an opinion one is confident
    | in.
    | 
    | That's the correct interpretation of "strong opinions", as I
    | understand the phrase.
    | 
    | The "weakly held" part means that you are willing to adjust
    | your opinion in the face of contradictory evidence, which is
    | difficult to do for deeply held beliefs.
 
    | chefandy wrote:
    | I'm not interested in delving into pedantry, so I'll stop
    | after this. My intuitive understanding of this phrase is that
    | strong or weak opinions are generally a measure of magnitude
    | more than stability, while strongly or weakly _held_ opinions
    | are a matter of stability rather than magnitude. Someone
    | might have a milder opinion of something, like  "Pepperoni
    | pizza is fine" vs. a stronger stance, such as "Pepperoni
    | pizza is the BEST pizza." How easily that opinion is changed
    | does not necessarily correlate. Perhaps the person who thinks
    | pepperoni pizza is the best has never tried salami pizza and
    | will be an instant convert. Maybe they're the world's BIGGEST
    | pepperoni fan. Maybe the person with the weaker opinion on
    | pepperoni might be very very unlikely to change it because
    | they don't care enough about pizza in general to consider it
    | much. Maybe they love pizza, but are one bite of pepperoni
    | pizza away from saying "bleh, hand me a slice of mushroom."
 
| mo_42 wrote:
| I like this nice little text. Einstein is a perfect example of
| mental liquidity. I think we should be very forgiving about this
| for two reasons: first, Einstein was one of the people who
| established quantum mechanics. He also got the Nobel Prize for
| his work on the photoelectric effect. Second, even the brightest
| minds have only a narrow time frame until mental ability starts
| to decline, so we cannot expect a brain to dig deep into general
| relativity and at the same time into something completely
| different like QM. Surprisingly, Einstein even contributed to QM
| in old age by trying to poke holes in the theory that later
| proved to be true (e.g., spooky action at a distance).
 
| golemotron wrote:
| It seems like this is a term for the ability to avoid the sunk
| cost fallacy
| ( https://www.scribbr.com/fallacies/sunk-cost-fallacy/ )
| 
| The link contains a number of reasons why people get trapped in
| the sunk cost fallacy.
 
| technological wrote:
| I think it is hard to change your beliefs because of the
| discomfort that is associated with that change.
 
| HellDunkel wrote:
| I just finished the Einstein biography by Walter I. and found
| Einstein's stubbornness quite entertaining. He knew about this
| trait, accepted it as an effect of ageing and even made jokes
| about it. He simply disliked some facts about quantum mechanics
| and allowed himself to pursue a rather fruitless endeavour for
| many years. He knew that this kind of stubbornness would kill the
| career of a younger scientist, but he could afford to do so. In
| that sense he contributed to science.
 
  | neerajsi wrote:
  | You're quite right. Science requires skepticism in order to
  | apply the stress to theories that is needed to make them strong.
  | I'm assuming Einstein tried to raise objections using evidence
  | to the contrary and alternative explanations.
 
    | n4r9 wrote:
    | Along with Podolsky and Rosen he formulated one of the
    | original quantum thought experiments to challenge the
    | accepted conventions: https://en.m.wikipedia.org/wiki/Einstei
    | n%E2%80%93Podolsky%E2...
    | 
    | This in turn inspired Bell's theorem, and eventually quantum
    | information theory.
 
| zone411 wrote:
| "Mental fluidity," "mental flexibility," or "cognitive
| flexibility" seem like better terms.
 
  | layer8 wrote:
  | Creedoplasticity? Pisteuoplasticity?
 
  | gms7777 wrote:
  | or "mental/cognitive plasticity"
 
    | nmstoker wrote:
    | My thoughts exactly: this is generally referred to in the
    | literature as mental/brain plasticity.
    | 
    | Coining a new term when a perfectly good one exists is
    | unfortunate but happens, as seen with the author here.
    | 
    | Edit: here's a link to neuroplasticity (aka brain
    | plasticity):
    | 
    | https://en.m.wikipedia.org/wiki/Neuroplasticity
 
| cortesoft wrote:
| > It might sound crazy, but I think a good rule of thumb is that
| your strongest convictions have the highest chance of being wrong
| or incomplete, if only because they are the hardest beliefs to
| challenge, update, and abandon when necessary.
| 
| I strongly disagree with this, unless we are only talking about
| beliefs that are about facts of the universe.
| 
| For example, my strongest belief is that all people have an equal
| right to exist and pursue their own purpose... this is not a
| belief about the facts of the universe, but about my own
| morality. I don't think it has a chance to be 'wrong'
 
  | JakkTrent wrote:
  | I believe that knowing and believing are two different things ;)
  | 
  | Belief is far stronger - that's why people do things all the
  | time they themselves at one point "knew" they couldn't do.
  | 
  | If you start with a flawed belief - things won't improve from
  | there. You'll end up "knowing" a whole lot of stuff that
  | reinforces your flawed belief - simply
  | glossing/ignoring/downplaying the facts that don't support...
  | this becomes a bit of a feedback loop after a while.
  | 
  | So either learn to let go of your beliefs and adapt or at least
  | don't firmly establish beliefs until after you know enough
  | stuff to decide for yourself what to believe.
  | 
  | I reevaluate mine all the time, and I'm not wrong on any of my
  | strong convictions - albeit from my point of view, which I've
  | made as broad as possible, but I'm still human.
  | 
  | My highest beliefs today are built upon a foundation of
  | information, learning and mistakes - I may state a belief with
  | a single sentence but I can write books about why I've arrived
  | at that belief.
  | 
  | I don't think that's morality - I sometimes do things I "know"
  | to be immoral when the justification warrants it, but I've
  | never knowingly decided to believe something I know is wrong -
  | even if I was forced, I'd only pretend to believe at best.
  | 
  | In college I'd cheat on a test tho if I thought it the only way
  | I'd pass - bc I believed passing was more important than the
  | test... maybe it's a bad example of immorality.
  | 
  | Anyways, I completely agree with Cortesoft - I'm settling on
  | the understanding that all people everywhere are fundamentally
  | important, collectively and individually.
  | 
  | Allowing and empowering all people to live their best lives is
  | in all of our best interest. I've gone further even than equal
  | right to existence and yet I'm supremely confident.
  | 
  | I think this rant also rather effectively demonstrates exactly
  | what the OP was saying about our strongest convictions.
  | 
  | An incorrect fundamental belief - like say I believed the earth
  | was flat, that belief would be implicit in all that I believe
  | after that, just part of my world view and muddling up
  | everything I think about anything - I wouldn't even be aware of
  | that.
  | 
  | Mental liquidity. Fantastic.
  | 
  | Otherwise knowledge can be an immovable trap that becomes
  | harder to avoid/escape the more stuff you know.
  | 
  | Scientists are great examples of this - if it can't be
  | scientifically methodized it doesn't exist and therefore must
  | be explainable within the framework they already know, bc
  | that's always right ;)
 
| robg wrote:
| It's easy to forget how difficult learning is, for us as
| individuals and as flocks in formation. Pick any topic and it's
| likely it took you years to learn well. So simply switching out
| beliefs embedded in that topic requires overwriting years of
| patterns and synapses in sync.
| 
| This is where Kuhn is so helpful: in understanding that even
| scientists have immense difficulty, if not vigorous myopia, and
| get stuck with wrong beliefs. Paradigm shifts via funerals are
| easier over decades than getting scientists to evolve their
| models.
 
  | MichaelZuo wrote:
  | So much so that I would almost define intelligence as the
  | ability to "switch out beliefs".
 
    | lanstin wrote:
    | The geologists that died disbelieving in plate tectonics
    | weren't free of intelligence. The very systems that allow us
    | to find patterns are also liable to get stuck with seeing
    | certain patterns.
    | 
    | Not only is it impossible with current human knowledge to
    | construct an infallible theory that predicts everything we
    | encounter, it is also impossible with current human
    | physiology never to cling to wrong ideas in the face of
    | counter evidence. When examining our rationality, we must not
    | only admit that our data are incomplete and our theories are
    | flawed, but also that we ourselves might be thinking foolishly.
 
| sedivy94 wrote:
| A term I've come to like more is "cognitive flexibility".
 
| xyzelement wrote:
| Perhaps unexpectedly, I find that thoughtful engagement with
| religion (Judaism in my case) has helped me become much more
| liquid on other topics.
| 
| When you accept on faith a handful of principles that deal with
| an unknowable domain, it becomes much easier to be less attached
| to the other stuff.
 
  | haswell wrote:
  | I grew up under a toxic form of fundamentalist Christianity
  | that left deep scars and made me pretty allergic to religion.
  | 
  | For me, I've found success and deep value in exploring non-
  | sectarian Buddhist philosophy, which points directly at the
  | problems caused by attachment to ideas and things, and does a
  | good job of deconstructing thought processes that most of us
  | engage in without realizing.
  | 
  | To me, this is less about choosing to accept certain principles
  | on faith as much as it is about recognizing/acknowledging that
  | _this is what we already do_ in most aspects of our lives.
  | 
  | To anyone who can find value in traditional religious
  | contemplation while avoiding the downsides, more power to you.
  | The point of my comment isn't to say there's nothing to be
  | found there, but if the version of religiosity you're familiar
  | with is the toxic kind, there are other paths to follow that
  | get at some arguably important insights without some of the
  | baggage that can be difficult to avoid.
  | 
  | (I realize Buddhism has religious roots, but there is a long
  | history of exploring the underlying insights in a non-religious
  | context e.g. Zen, and the analytical framework associated with
  | traditions like Dzogchen and Vipassana are applicable without
  | any of the metaphysical underpinnings).
 
    | xyzelement wrote:
    | (I am the person you are responding to) I grew up completely
    | ignorant of religion and my first foray into that was the
    | study of yogic tradition. Once I got a taste of what exists,
    | I was very lucky to realize my ancestral faith has incredible
    | depth, beyond that which is even understood by those who say
    | they are kinda religious (ie, many people who say they are of
    | religion X don't know how much there is to X)
    | 
    | On the toxic part, sorry to hear that. I think anything can
    | be toxic, orthogonally to the value of the concept (ie someone
    | may have a horrible experience with a coach but that doesn't
    | take away from the value of fitness in general) but it sounds
    | like you have a pattern that works well for you.
 
  | wwweston wrote:
  | > I find that thoughtful engagement with religion (Judaism in
  | my case)
  | 
  | I've heard Judaism characterized as very accepting of discourse
  | and reinterpretation of itself. Does this strike you as
  | accurate? If so, it sounds like a kind of mental liquidity...
  | 
  | > When you accept on faith a handful of principles that deal
  | with an unknowable domain
  | 
  | Sounds like mathematics, in which practitioners become used to
  | both the process of relying on a set of axioms and selecting
  | them for the purposes of exploring or constraining systems,
  | which makes one aware that there's a certain degree of choice
  | or even potentially arbitrariness to it...
 
    | xyzelement wrote:
    | I agree with you on both counts.
    | 
    | For example, the study of the Talmud is an example of both
    | mental training in debating an issue from several
    | perspectives, and the instilling of the idea that this is
    | part of the religion.
    | 
    | You can also look up "Jewish responsa" on Wikipedia as a
    | diving-off point into this.
 
  | Mutlut wrote:
  | You just might have discovered yourself what others did without
  | thinking: following some given path to stop worrying and using
  | it as 'this can't be wrong because it's old and others are doing
  | it and enabling me'.
  | 
  | Perhaps community fits even better.
  | 
  | I personally am free enough to design my own life without
  | boundaries.
 
    | xyzelement wrote:
    | Your current self-description and opinion of religion is
    | where I was prior to moving onto my current state. Looking
    | backwards, going beyond this represented breaking a boundary
    | for me.
    | 
    | I am not trying to persuade you and I am holding back from
    | expounding on what I mean at length here, just sharing the
    | perspective.
 
      | Mutlut wrote:
      | I wouldn't mind hearing your perspective.
      | 
      | I did think about it a lot, and it's definitely exhausting to
      | be free, but I have been a nihilist since 16. Thought through
      | tons of ideas and concepts (what if the universe is repeating
      | itself, no free will, after life, before life, 'the egg'
      | story, lsd, mdma, ...)
      | 
      | I'm now quite happy and content and still curious about my
      | life. Haven't felt better than this and am taking the next
      | step: getting a farm and transforming my environment into how
      | I want it to be.
 
| willtemperley wrote:
| Be a goldfish: Ted Lasso, 2020.
 
  | Etrnl_President wrote:
  | [dead]
 
| photochemsyn wrote:
| One approach to preserving mental fluidity is to not get
| emotionally attached to ideas. This was expressed by Richard
| Feynman in his 1979 lectures on quantum electrodynamics,
| available here:
| 
| http://www.vega.org.uk/video/subseries/8
| 
| > Q: "Do you like the idea that our picture of the world has to
| be based on a calculation which involves probability?"
| 
| > A: "...if I get right down to it, I don't say I like it and I
| don't say I don't like it. I got very highly trained over the
| years to be a scientist and there's a certain way you have to
| look at things. When I give a talk I simplify a little bit, I
| cheat a little bit to make it sound like I don't like it. What I
| mean is it's peculiar. But I never think, this is what I like and
| this is what I don't like, I think this is what it is and this is
| what it isn't. And whether I like it or I don't like it is really
| irrelevant and believe it or not I have extracted it out of my
| mind. I do not even ask myself whether I like it or I don't like
| it because it's a complete irrelevance."
| 
| I think that's critical, because if you become emotionally
| involved with promoting an abstract idea, it becomes part of your
| personal identity or self-image, and then changing your mind
| about it in the face of new evidence becomes very difficult if
| not impossible.
| 
| In another lecture, Feynman also said something about not telling
| Nature how it should behave, as that would be an act of hubris,
| or words to that effect; you just have to accept what the
| evidence points to, like it or not.
| 
| (Changing your mind about what's morally acceptable, socially
| taboo, aesthetically pleasing etc. is an entirely different
| subject, science can't really help much with such questions.)
 
| cubefox wrote:
| The best way to test your "mental liquidity" is to think about
| some hypotheses that are outside the "Overton window" or even
| outright taboo.
| 
| "What if ***** were true? Surely it can't be true. If it were,
| that would be terrible."
| 
| That's motivated reasoning. Remember that the truth of any
| hypothesis is not influenced by how much you want it to be true,
| or false. Some hypotheses are deeply uncomfortable, but you
| should nonetheless strive to believe the truth. Or rather, what
| is best supported by the evidence. Even if it hurts.
 
  | noduerme wrote:
  | Actually, most people shouldn't do that in most cases, because
  | they aren't qualified to understand the evidence presented to
  | them. Nor are the hypotheses they're testing their own.
  | Valuable hypotheses arise from evidence - not vice versa. This
  | is why juries in complex cases need so much time to be walked
  | through subject matter by expert witnesses, and why standards
  | of evidence are applied to what they are and aren't allowed to
  | hear, and why the conclusions they may or may not draw are
  | circumscribed to the cases being made by lawyers as allowed by
  | judges. When people search the internet for evidence to support
  | their most uncomfortable hypotheses, they'll always find it.
  | That's how we get masses of people who believe in conspiracy
  | theories and satanic panics, with the certainty of those who
  | incorrectly believe they've done their own "research".
  | 
  | Taking up the most uncomfortable (i.e. "forbidden") hypothesis
  | and giving it the weight required to attempt to prove it to
  | yourself is not a systematic way of finding truth; it's a way
  | of deceiving yourself into believing in the simplistic
  | frameworks of other people's paranoid conspiracy theories.
 
    | cubefox wrote:
    | The above was only a case against wishful thinking and
    | rationalization. Of course expert testimony is still some
    | form of evidence. The point is not to willfully ignore or
    | reinterpret the evidence because you don't like the direction
    | it is pointing at.
    | 
    | It is worth citing the Litany of Gendlin:
    | 
    |  _What is true is already so.
    | 
    | Owning up to it doesn't make it worse.
    | 
    | Not being open about it doesn't make it go away.
    | 
    | And because it's true, it is what is there to be interacted
    | with.
    | 
    | Anything untrue isn't there to be lived.
    | 
    | People can stand what is true,
    | 
    | for they are already enduring it._
 
      | lanstin wrote:
      | What was that old definition of ideology, an unreal
      | relation to real facts?
 
| hcarvalhoalves wrote:
| > Changing your mind is hard because it's easier to fool yourself
| into believing a falsehood than admit a mistake.
| 
| Different angle: it's not simply "fooling" oneself, but it's
| because ideas are one way or another built on top of an
| ideological foundation.
| 
| Einstein rejecting quantum theory on the basis that the universe
| shouldn't have a random component to it is also rejecting the
| idea of having to re-examine all philosophy past Descartes and
| Newton, which aligned so well with society's viewpoint at the
| time - a deterministic, cause-consequence universe, where things
| have logical explanations and where hard work is rewarded.
 
| tartakovsky wrote:
| Related: https://medium.com/@ameet/strong-opinions-weakly-held-a-
| fram...
 
___________________________________________________________________
(page generated 2023-06-11 23:00 UTC)