[HN Gopher] The McNamara fallacy: Measurement is not understanding
___________________________________________________________________
 
The McNamara fallacy: Measurement is not understanding
 
Author : wenc
Score  : 174 points
Date   : 2022-02-01 18:02 UTC (4 hours ago)
 
web link (mcnamarafallacy.com)
w3m dump (mcnamarafallacy.com)
 
| [deleted]
 
| serverlessmom wrote:
| "In war, whichever side may call itself the victor, there are no
| winners, but all are losers,"- Neville Chamberlain.
| 
| This is a really interesting look at this type of mindset. I have
| often wondered what it is to "win" a war, and the quote that came
| to mind was the one I posted above. Measurement is certainly not
| understanding in all situations.
 
| sandworm101 wrote:
| What is the name for the fallacy of thinking that wars are always
| meant to be won? Throughout history wars have been fought without
| any intention of "winning". Sometimes a war can serve a religious
| or ceremonial purpose, one that doesn't require a clear winner.
| Other wars have been fought for completely economic reasons.
| Others are proxy contests whereby greater powers can demonstrate
| their abilities without directly engaging each other. The false
| thinking is the assumption that participants always want or even
| care about winning.
 
  | missedthecue wrote:
  | Wars might have unclear objectives and mission-creep, but I
  | don't think anyone is fighting to lose.
 
    | serial_dev wrote:
    | The question is about what "winning" means for the people who
    | start wars and keep fighting them. It might not be the same
    | as for you.
    | 
    | To quote Julian Assange, "The goal is to use Afghanistan to
    | wash money out of the tax bases of the US and Europe through
    | Afghanistan and back into the hands of a transnational
    | security elite. The goal is an endless war, not a successful
    | war".
 
      | missedthecue wrote:
      | I just don't buy that line of thinking at all. If that is
      | truly their goal, there are way easier and subtle ways to
      | do it without causing a collapse in political capital for
      | the parties and politicians involved, which the Vietnam,
      | Iraq, and Afghanistan wars did.
 
    | [deleted]
 
| eumoria wrote:
| McNamara basically admits this himself in the documentary The Fog
| of War:
| 
| https://en.wikipedia.org/wiki/The_Fog_of_War
| 
| It's worth a watch but it's very soft on him and his role. Still
| a very good documentary.
 
  | ProAm wrote:
  | Great documentary. I felt it was a person trying to come clean
  | and ease his conscience on his deathbed.
 
    | 2OEH8eoCRo0 wrote:
    | Agreed. Great documentary. The part where Castro said he
    | urged Russia to launch their nukes from Cuba to the US
    | knowing it would destroy Cuba was chilling. Humans are not
    | always logical. Don't assume somebody won't drag an entire
    | country or the world to total destruction for some deranged
    | cause.
    | 
    | I didn't get the death bed vibe from McNamara but I
    | definitely felt that he was genuinely reflecting on the past.
    | 
    | The documentary on Rumsfeld was the polar opposite. I could
    | also see Rumsfeld not wanting to give the enemy in an ongoing
    | conflict any shred of material. It makes for a less
    | interesting documentary.
 
      | nickdothutton wrote:
      | Morris himself said something like he didn't really feel he
      | got to know Rumsfeld and had no real idea what was in his
      | mind, compared to his previous interview with McNamara.
 
        | wayoutthere wrote:
        | Guess Donald himself was one of the "unknown unknowns"
 
  | smaug7 wrote:
  | Was looking for this comment. In McNamara's reflection, a
  | tenet he called out was to understand the enemy. The US didn't
  | understand the Viet Cong's motivation for fighting the war
  | (freedom from colonizers), whereas the US viewed it as part of
  | a larger Cold War. The same thing happened in Afghanistan; we
  | didn't learn.
 
    | calyth2018 wrote:
    | I'd argue the US has not tried to understand the enemy since
    | the Cuban missile crisis. There are more failures outside of
    | Afghanistan, and I think the US is going to walk right into
    | another one.
 
  | calyth2018 wrote:
  | I didn't live through that era, so maybe it's not my place to
  | say whether Morris was particularly soft on him. The fact that
  | he said it was the president's responsibility revealed a lot
  | about him.
  | 
  | On the other hand, given that The Fog of War did play back a
  | recording in which Johnson had a much different view on Vietnam
  | than JFK, I wouldn't put the burden solely on McNamara's
  | shoulders either.
  | 
  | Regardless of what one might think of his role, it was still
  | quite enlightening, and I think more people should watch it. I
  | think the lessons outlined in it are useful, but too few have
  | taken heed of it.
 
  | jkingsbery wrote:
  | I haven't seen that one, but I've been watching the Ken Burns
  | documentary recently. It seems suitably fair. Where some of
  | OP's proposed "McNamara Fallacy" maybe breaks down: according
  | to the archives they go through in the documentary, he knew for
  | a long time that his approach wasn't working; he just did not
  | (or would not, or could not, depending on your perspective) say
  | so publicly, and didn't seem to have any other way to measure
  | progress.
 
| johnp271 wrote:
| Does the McNamara Fallacy have any application to our response to
| COVID? I often hear pundits of all sorts, medical doctors,
| epidemiologists, politicians, CDC scientists, etc, make
| statements such as "the data shows this" and "the data says that"
| and then follow up with "therefore the science says we must all
| do such-and-such". I hold a rather narrow, rigorous - maybe
| closed minded - opinion of what is 'science' (so to me 'social
| science' is an oxymoron) thus I have a degree of skepticism when
| data analysis is relied on to heavily for making conclusions that
| are then called 'scientific'.
 
  | p_l wrote:
  | More like the data about COVID is used as fig leaf for metrics
  | in other areas actually driving the decisions, or
  | ideology/dogma.
  | 
  | If you go with hard data and experience, you'd do hard moves
  | like China (and many other Asian countries) did. In fact,
  | similar moves have been done in the past in Europe (on the
  | communist side of Iron Curtain) to stop epidemics, including
  | even manual contact tracing.
  | 
  | But because a non-trivial force in decision making has strong
  | other incentives, and because of dogma like disbelief in
  | aerosol transmission, we end up with really bad decisions with a
  | fig leaf of data analysis.
 
    | calyth2018 wrote:
    | > because of dogma like disbelief in aerosol transmission
    | 
    | It's not just that COVID can spread via aerosol. It's more
    | that the Western world extrapolated the definition from a
    | study on TB, neglecting that TB needs to infect deep in the
    | lungs.
    | 
    | https://www.wired.com/story/the-teeny-tiny-scientific-
    | screwu...
    | 
    | There is a scientific paper version of this, btw.
 
| jscode wrote:
| Quick story: I was the CFO for a company that sold to a private
| equity group (PEG). I took over as the CEO as the founders
| retired, leaving me to deal with the PEG. It quickly became
| apparent that the PEG managers looked at everything through the
| lens of an Excel spreadsheet. These guys were brilliant attorneys
| and analysts but lacked experience building businesses and
| managing teams. Ultimately, they couldn't add much value in terms
| of operations or strategy, but they were great at financial
| modeling/quantitative analysis and forcing us to justify
| expenses. That may sound good at first--eliminating wasteful
| spending--but it ultimately led to the gradual erosion of the
| company culture and employee loyalty. It's easy to cut benefits
| and pay given that many workers lack the leverage to do anything
| about it, while it's much harder to reduce hard costs like
| materials and equipment. That meant employees just kept getting
| squeezed, and it was surprisingly difficult to quantify the
| impact that terminating an employee or cutting benefits would
| have on morale/culture/performance.
| 
| The moral of the story is that people with analyst mindsets play
| an essential role in our economy, but sometimes giving those
| people power over large organizations can have disastrous
| consequences. There truly is a disconnect between measurement and
| understanding.
 
  | boringg wrote:
  | While your experience sounds painful it also sounds like
  | something that would have happened in the 90s. I don't think
  | most tier 1 organizations still think in that way.
 
    | apohn wrote:
    | Less than a decade ago I worked at a company that was
    | acquired by a private equity group. What jscode said matches
    | my experience. For a while I also tracked (on Glassdoor and
    | some other sites) companies that were purchased by that firm,
    | and it seems like employees at different companies had the
    | same experience. EBITA was king; nothing else mattered.
 
    | bb88 wrote:
    | The company exists for the stockholders, not for the
    | employees.
    | 
    | Stack ranking is still used widely throughout Fortune 500
    | companies, which is one of the most culture destroying
    | management practices known to man.
 
  | milesvp wrote:
  | I worked at a company that started to go through the "you can't
  | improve what you don't measure" phase. In general it was good
  | for the org I was in, but I used to have to remind management
  | that there's a corollary to that saying, which is: you
  | necessarily improve things you measure at the expense of the
  | things which are difficult or impossible to measure.
  | 
  | This seems to be a hard one for some types to truly grok. A
  | common response is that we need to figure out how to measure
  | it, thinking there was some single magic number that things
  | could be distilled down to. But often, even if you figure out
  | how to measure some of them, there are always other intangibles
  | you're not tracking. So you need to always be conscious of it.
 
  | hn_version_0023 wrote:
  | I'm not the type to pass up making a Star Wars reference, so
  | here goes:
  | 
  | "One would think you Jedi would understand the difference
  | between _knowledge_ and... heh heh... _wisdom_ "
 
  | apohn wrote:
  | I worked at a company that went through a private equity
  | acquisition and I have a question you might be able to answer.
  | 
  | If you exclude layoffs and incentivized retirement, it seemed
  | that a greater percentage of individual contributors left
  | compared to managers. Lots of managers stuck around for 12+
  | months, and it seemed like the percentage of managers of
  | managers (e.g. directors, VPs) who stuck around was even
  | greater. Almost the entire C-suite stayed for years after the
  | acquisition.
  | 
  | Were there any financial or other incentives given to managers
  | to stay? As an individual contributor, the morale was just
  | terrible. I just couldn't understand why the managers and other
  | people in leadership positions stuck around.
 
    | jscode wrote:
    | > I just couldn't understand why the managers and other
    | people in leadership positions stuck around.
    | 
    | Money. Investors typically carve out equity to retain key
    | personnel (i.e., management units/stock). The units are
    | worthless unless the company appreciates in value, so
    | management becomes laser-focused on doing whatever it takes
    | to increase the company's valuation. Everything else becomes
    | a secondary concern.
 
  | mrxd wrote:
  | Just to play devil's advocate, surely their approach is more
  | rational than that. They're probably looking at it from the
  | perspective that the business needs to have a profit margin of
  | X in order to justify investing in it.
  | 
  | They probably do understand that cutting costs impacts company
  | culture and morale. But shutting the company down probably
  | impacts that much more.
 
    | jscode wrote:
    | They do understand that cutting costs will have an impact on
    | culture and morale, they just think the marginal benefit
    | exceeds the marginal cost. Keep in mind, PEG managers are
    | chasing a carried interest bonus which they only achieve
    | after covering the minimum return promised to their
    | investors. Plus, leveraged buyouts--which PEGs frequently use
    | --increase a company's risk of failure. Everyone's under
    | intense pressure to perform.
    | 
    | Massive Financial Incentives + Highly Leveraged Balance Sheet
    | + Intense Pressure = Risky Decision Making
 
    | mcguire wrote:
    | " _But shutting the company down probably impacts that much
    | more._ "
    | 
    | Is that the _only_ other option?
 
  | jkingsbery wrote:
  | "Frupidity" is a term I've heard used for this.
 
  | Buttons840 wrote:
  | Interesting observation. The PEG managers would probably admit
  | that turnover had some financial cost, but I doubt they ever
  | actually put a number on it and added it to their calculations.
  | Am I right?
 
  | musicale wrote:
  | > private equity group
  | 
  | The goal of a private equity group can sometimes be to extract
  | as much money from the company as possible in a given time
  | frame, rather than to ensure the long-term success or survival
  | of the company.
 
    | giva wrote:
    | And the goal of the PEG's employees is to get bonuses by
    | hitting the metrics set for them.
    | 
    | Metrics are the only goal that matters there. It's all about
    | making the numbers look pretty.
 
  | peteradio wrote:
  | I worked at a place with a lean 6 sigma certified specialist
  | who, towards the end of the company's doom, effectively had the
  | lead engineer cleaning out molding machines to track down every
  | last tiny molded part that, over the course of several years of
  | continuous running, had been flung outside of its target. The
  | same guy told me that if the Coke machine ever stole my change,
  | he'd help me get it back from the vendor.
 
    | 7thaccount wrote:
    | All the lean sigma stuff seems like another useless
    | management fad to me that only benefits consultants. Is that
    | what you're saying here?
 
      | Spooky23 wrote:
      | It's an expression of distrust.
      | 
      | If you ever work in government, procurement people think
      | like that because their goal is an objective, competitive
      | process that meets the minimum standard to fulfill the
      | purpose.
      | 
      | There's a certain logic to it. You don't want to see random
      | government employees driving around in Teslas, so generally
      | speaking they will be in nondescript 4-door sedans. Having
      | a human say "no" makes them accountable, so a complex
      | process will determine what kind of car you need.
      | 
      | Taken to an extreme, it becomes a problem. Procurement
      | officers get lazy and focus on their process instead of the
      | needs of the customer. So they treat humans like Ford
      | Tauruses and allow vendors who understand how to game the
      | process to walk out the door with millions.
 
      | peteradio wrote:
      | I'm only speaking towards this one particularly useless
      | buffoon, but the fact that he was allowed to wield any sort
      | of power over anyone says something.
 
      | thereddaikon wrote:
      | Like many management system fads, it started as a useful
      | kernel of wisdom or obvious maxim that idiots ran with and
      | turned into a monster. In the case of six sigma, the idea
      | is about constantly optimizing your workflows and not
      | accepting "we've always done it this way" as an excuse.
      | 
      | But most people lack the critical thinking skills to
      | correctly apply wisdom when necessary and instead need a
      | solid framework to operate within. That's how these things
      | inevitably develop. Just like how Agile is supposed to be
      | about getting working code over being bogged down in
      | process, but inevitably ends up with half-baked products
      | that have massive issues.
      | 
      | Six Sigma also gets applied to industries it has no
      | business being in. The mentality works best when you have a
      | fixed workflow. In manufacturing it would be if you are
      | making a lot of one thing. You can do a lot of optimizing.
      | But I have seen it employed in organizations where every
      | project was vastly different. Instead of a lean approach, it
      | should have been (and previously was) following a house-of-
      | quality philosophy. This happened to be in an industry where
      | cost was rarely a consideration but performance and
      | reliability were.
 
        | 7thaccount wrote:
        | Great comment and thank you. It has only popped up
        | sparingly in my industry thankfully.
        | 
        | What you're saying makes sense to me in that you should
        | never fall into the "this is how I've always done it"
        | trap, but putting a complex bureaucratic process around
        | that is just going to create a whole new problem.
 
  | jancsika wrote:
  | > The moral of the story is that people with analyst mindsets
  | play an essential role in our economy, but sometimes giving
  | those people power over large organizations can have disastrous
  | consequences.
  | 
  | I'm gonna cosplay an "analyst mindset":
  | 
  | 1. Need to measure costs and benefits of slashing benefits/pay.
  | 
  | 2. A benefit-- slashing benefits/pay allows us to hit some
  | obvious financial goal
  | 
  | 3. A cost-- Uh oh, I don't yet know how to reliably measure
  | _any_ of the costs.
  | 
  | 4. Good analysts don't take action without measuring.
  | 
  | 5. I'm a good analyst.
  | 
  | Conclusion: I cannot take the action of slashing benefits/pay
  | 
  | The only way to make it work is to add a step "3b: cherry pick
  | metrics for the costs of slashing benefits/pay such that the
  | phony metrics justify the decision management already wants to
  | make of slashing benefits/pay." But now we've shifted from
  | "analyst mentality" to "the mindset of the little Beetle-like
  | bureaucrats described by George Orwell in 1984."
 
    | jacobr1 wrote:
    | Having dealt with 2 PE exits, rarely is the proposal
    | something as upfront and silly as slashing everyone's pay.
    | That probably does happen for a company being restructured in
    | the red, but the more subtle actions tend to be things like:
    | 
    | * Comp bands are now targeting p50 averages rather than p75
    | or top of market. So you can't close new hires that are going
    | to competitors. And you can't give raises to your top
    | performers.
    | 
    | * The health benefits are less generous when renegotiated for
    | the following year
    | 
    | * T&E that would have been approved - granted some maybe that
    | shouldn't - but importantly some that should have for top-
    | sales people, are no longer approvable. So your top sales
    | people leave. Or similarly the accelerators or other measures
    | are changed, that might look good on paper but rub top sales
    | people the wrong way.
    | 
    | * Head-count isn't replaced, so teams have to take on more
    | work
    | 
    | * Perks like conference attendance or hardware upgrades,
    | which arguably aren't perks but investments in your team's
    | productivity, are cut/limited
 
      | bb88 wrote:
      | Cash is still King. The more cash on hand, the easier it is
      | for the business to survive in an economic downturn, and
      | the more dividends and stock buybacks can happen for the
      | investors.
      | 
      | Software engineers, from a CFO's perspective, aren't really
      | any different from plumbers or carpenters. It's just labor.
      | Getting a cheaper rate on labor is far more beneficial to
      | the company than, say, making sure its employees are happy.
      | 
      | If the company could cut wages 50% across the board and
      | then give the executives 20% raises for saving 50% in
      | labor, they would do it in a heartbeat.
 
        | Aeolun wrote:
        | > Getting a cheaper rate on labor is far more beneficial
        | to the company than, say, making sure its employees are
        | happy.
        | 
        | Whoa, citation needed. I'm fairly certain this isn't true
        | unless you are optimizing for just the next month.
 
      | jscode wrote:
      | >> Having dealt with 2 PE exists, rarely is the proposal
      | something as upfront and silly as slash everyones pay.
      | 
      | Agreed, but a gradual erosion can occur over the course of
      | several years. PEGs and the operating company's management
      | team have massive incentives to hit their growth metrics.
      | If decreasing 401K contributions helps management hit their
      | EBITDA target, many would argue that they should do exactly
      | that. However, the impact of these decisions accumulates
      | over time and can eventually derail a company's
      | performance.
 
      | andrei_says_ wrote:
      | What were the longer term changes you saw and their
      | outcomes for the health of the teams and companies?
 
    | serverlessmom wrote:
    | I definitely agree with you. I would further argue that the
    | "analyst mentality" has become so ingrained in the foundation
    | of capitalism (and always has been, really) that we see
    | companies across many industries closing down due to the
    | "worker shortage", when in reality the CEOs of the companies
    | complaining the loudest that "no one wants to work" have
    | decided that paying living wages in a deeply competitive job
    | market is less doable than letting the company totally
    | stagnate and dissolve due to a lack of people willing to work
    | hard for pennies.
 
| abhishekjha wrote:
| Does this have any QM implications?
 
| bell-cot wrote:
| "The moral is to the physical as three to one." - Napoleon I
 
| seanwilson wrote:
| > What McNamara didn't keep track of was the narrative of the
| war, the meaning that it had both within the military forces of
| each side, but also in the civilian populations of the nations
| involved.
| 
| I'm probably not following and not saying it's easy but aren't
| there some metrics you could track that with? How was it
| determined/measured later that the meaning was important?
| Couldn't you at least do polls in your own country?
 
  | asplake wrote:
  | But that's the point - it's not just how many, but what they
  | are willing and able to do, their positional and operational
  | strengths and weaknesses, and so on. Condensing all that into a
  | number is impossible.
 
| rossdavidh wrote:
| I believe the McNamara Fallacy is a real thing, although I would
| say Goodhart's Law expresses the more fundamental issue. But, the
| examples they give (Vietnam and Afghanistan wars) are bad
| examples, because in both cases the issue was primarily not that
| the U.S. military was relying too much on quantitative metrics,
| but rather that it was being asked to do a job which militaries
| are not good at.
| 
| In neither case, despite what many deniers and apologists
| proclaim, was the war "winnable" by the military, because in both
| cases the primary problem was that the government of the nation
| in question did not have the respect (and thus support) of its
| people. There is no strategy that an outside military can use,
| which will fix this, because it's not a military problem, and a
| military has the wrong personnel, the wrong training, and thus
| the wrong institutional mindset.
| 
| It is as if you sent the Red Cross in to defeat the Viet Cong or
| the Taliban, and then concluded that the reason they failed was
| that they were measuring numbers of hospital beds and counting
| how much medical equipment they needed. It was the political
| leadership, who decided this was a job for a military to perform,
| that was the root problem. McNamara's metrics were neither here
| nor there.
 
| shellback3 wrote:
| McNamara cut his teeth on real world problems in the AAF's Office
| of Statistics and worked with General Curtis LeMay to plan how
| best to use the B29 bombers. He became an expert with statistics
| and other math tools and thought they could be widely applied to
| business and, of course, war. Once he had these marvelous tools
| he found nails everywhere.
| 
| For instance the rate of German tank production was accurately
| estimated by collecting the serial numbers of all the tanks (and
| some tank components) that were knocked out. The best methods of
| attacking submarines were worked out this way, as well as where
| to place armor in bombers, etc.
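 
  A minimal sketch of the estimator being alluded to here - the
  classic "German tank problem" formula - with made-up serial
  numbers for illustration, not figures from the thread:
 
      # Standard frequentist estimate of total production from a sample
      # of observed serial numbers: N_hat = m * (1 + 1/k) - 1, where m
      # is the largest serial seen and k is the sample size.
      def estimate_total(serials):
          k = len(serials)   # number of units observed
          m = max(serials)   # largest serial number observed
          return m * (1 + 1 / k) - 1
 
      # Hypothetical serial numbers recovered from knocked-out tanks:
      observed = [19, 40, 42, 60]
      print(estimate_total(observed))  # ~74.0, versus a naive guess of 60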
 
  | topspin wrote:
  | > Once he had these marvelous tools he found nails everywhere.
  | 
  | He also found two presidents and a sycophantic media that hung
  | on his every word.
  | 
  | His figures were excellent. In the earliest days of Vietnam,
  | long before anyone outside of Asia could find it on a map, he
  | produced eerily precise predictions of the costs -- in lives,
  | dollars and time -- of the future conflict. He told them what a
  | civil war in the jungle would look like and they pulled the
  | trigger.
  | 
  | As far as the value of measurement goes, I think most of the
  | low hanging fruit has been picked (a consequence of the
  | "Information Age") and what we struggle to 'measure' today is
  | far less tractable. As a result our measurements are frequently
  | corrupted in the service of prevailing agendas or not permitted
  | at all for fear of undesirable results.
 
| dempedempe wrote:
| This article is good, but not great - the author only gives one
| example of how quantitative-only reasoning can be bad (the
| example of the poppies). The other "example" is just the US
| military lying.
| 
| There are also no specific examples of non-quantitative reasoning
| that, if ignored, would be damaging.
| 
| I feel like the Wikipedia article does a better job explaining
| this: https://en.wikipedia.org/wiki/McNamara_fallacy
| 
| Also, why is there an entire webpage dedicated to this?
 
  | hummusandsushi wrote:
  | The webpage appears to be made to serve as a warning to data-
  | driven businesses not to fall into the same perverse set of
  | incentives that McNamara created, and to encourage business
  | managers to diversify their accounts of their business's
  | success beyond just the quantitative narrative.
 
    | pakitan wrote:
    | I'd be more cynical and suspect this is some kind of
    | elaborate SEO strategy. Submit the site to social networks ->
    | wait till it gets some love from the Google algorithm -> put
    | ads on it -> profit.
 
      | hummusandsushi wrote:
      | Possible, but this seems like rather a niche interest for a
      | successful SEO strategy. I could grant that it would then
      | be targeted specifically at the HN sort of social networks.
 
      | mulmen wrote:
      | Why not just start with ads?
 
  | enkid wrote:
  | The example with the US military lying certainly demonstrates
  | another issue with overvaluing metrics - making sure you have
  | good data. People who get wrapped up in metrics also tend not
  | to look at the data they are being presented. One of the
  | issues McNamara had was he was receiving inflated body count
  | figures. [0] Not only was he measuring the wrong thing, he was
  | measuring it badly.
  | 
  | [0]
  | https://en.wikipedia.org/wiki/Vietnam_War_body_count_controv...
 
    | the_af wrote:
    | It's also a good example of perverse incentives or "gaming
    | the metrics" (in a horrifying way). If the main measure of
    | success was body count, and every kill was considered an
    | enemy by default, this encouraged just killing people and
    | counting them as successes.
 
      | shellback3 wrote:
      | I was in Vietnam in a Naval Support Group. I had
      | conversations with army officers that had combat experience
      | and learned that since no statistics were collected about
      | dead civilians all the dead were counted as enemy soldiers
      | - and everyone from the top down knew it.
 
| paulpauper wrote:
| I don't understand what the fallacy is. Is it that an
| overreliance on data can overlook factors not in the data? Why is
| that a surprise? You would also have to show that not relying on
| data would generate better results.
 
  | warning26 wrote:
  | _> Is it that an overreliance on data can overlook factors not
  | in the data? Why is that a surprise?_
  | 
  | The vast majority of PMs I've worked with don't seem to
  | understand this at all. If it's not in the metrics, it doesn't
  | exist!
 
  | 0xdeadbeefbabe wrote:
  | If understanding is what you seek then
  | https://www.youtube.com/watch?v=eNDsd798HR4 is a nice long
  | discussion on McNamara overlooking factors not in the data. The
  | fallacy didn't earn its name for simple reasons.
 
  | _Nat_ wrote:
  | It's probably one of those things that might seem more obvious
  | in simple cases, but might surprise folks in more complex
  | cases.
  | 
  | For example, a simple case: Say you go on vacation for a few
  | weeks with a certain amount of cash to spend. Upon arriving at
  | your destination, you immediately purchase some indulgence,
  | and, hey, you're feeling better! Why not immediately keep
  | spending as much as possible to maximize?
  | 
  | For example, a complex case: Say you're leading a country. You
  | set up policies that, after a few years, seem to have led to a
  | higher GDP than was previously expected. Does that suggest that
  | the policies are leading to a better future?
  | 
  | In the simple cases, like the vacation-example above, it's easy
  | enough to understand the scenario and what's going on. And we
  | can imagine that, hey, immediately spending all of the cash
  | might lead to poor consequences for the rest of the vacation,
  | even if it delivers some quick satisfaction at first.
  | 
  | But in more complex cases, like with a country's policies
  | leading to a higher GDP, stuff can get trickier. We might say
  | that it's the lack of a top-level model: unlike in the vacation
  | scenario, where we were easily able to predict that there'd be
  | a lack of money later in the vacation, it might be harder to
  | say what else might be going on besides the GDP going up. And
  | all other things held equal, presumably a higher GDP would be
  | better than a lower GDP, and therefore all evidence points
  | toward the policies being a good idea, right?
 
    | enkid wrote:
    | GDP is actually a great example. It's actually very easy to
    | increase a country's GDP if that's the only thing you care
    | about. You just borrow more money and then spend it. GDP is
    | essentially a measure of spending inside a country.
    | If you borrow as much money as you can and then spend it on
    | things like infrastructure projects, you can instantly
    | increase the GDP. Banks see the growing GDP figure and assume
    | that it's a good thing and let you borrow more money. That is,
    | until they don't.
    | 
    | This happened with Brazil in the 1970's, when the oil crisis
    | made Brazil believe that they were going to face a downturn.
    | To overcome this, they borrowed money and went on an
    | infrastructure spending spree. This made Brazil look like an
    | economic miracle, growing when everyone else was struggling.
    | This encouraged more banks to lend to Brazil. The problem is
    | the money was spent on short term growth instead of things
    | that would more systematically grow the economy over the long
    | term. The government (and lending banks) was substituting
    | year-to-year GDP numbers for economic health. Once credit
    | tightened in the early 1980's, Brazil's growth plummeted.
    | 
    | This is of course an oversimplification, but I think we are
    | seeing similar problems in modern economies. People often
    | cite China's GDP growth, but they don't balance that out with
    | the debt they are taking on in order to finance that growth.
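 
    A toy illustration of the dynamic described above (made-up
    numbers, not data from the comment): debt-financed government
    spending feeds straight into GDP = C + I + G + NX, so measured
    "growth" can be bought with borrowing while debt piles up
    relative to the economy underneath it.
 
        # Baseline expenditure components, in arbitrary units.
        C, I, G, NX = 600, 200, 150, 50
        debt = 300
 
        for year in range(3):
            borrowed = 100                     # borrow and spend it all
            gdp = C + I + (G + borrowed) + NX  # boosted vs. a 1000 baseline
            debt += borrowed
            print(f"year {year}: GDP={gdp}, debt={debt}, "
                  f"debt/GDP={debt/gdp:.2f}")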
 
  | potatolicious wrote:
  | There are a bunch of factors and implications from the piece,
  | which I had hoped they'd go into. But some of the more obvious
  | implications which are often more organizational than they have
  | to do with the data per se:
  | 
  | - the data you are collecting may not include factors that are
  | consequential to outcomes. Treating these unmeasured factors as
  | inconsequential is hazardous. The piece specifically mentions
  | this effect. This effect is not a surprise but yet many
  | organizations fall into this trap so it seems to be worth
  | mentioning.
  | 
  | - the difficulty of measuring some factors will _result in
  | their exclusion_ from the dataset. Some things are
  | intrinsically hard to measure, and many organizations will as a
  | result refuse to measure them, and make decisions without them.
  | There needs to be a conscious and active process
  | organizationally to resist this and find effective ways of
  | measuring them.
  | 
  | - the difficulty of measuring some factors will _result in
  | easier but less useful proxies being used in their place_. Same
  | as above just with a somewhat different outcome. Organizations
  | are often blind to this happening as they come to believe the
  | proxy is as good as the real measure (or sometimes even that
  | the proxy _is_ the measure). A good industry example of this is
  | clickthrough rates being treated synonymously with audience
  | interest or content quality.
  | 
  | And a point I wish the piece made but did not:
  | 
  | - measuring something does not grant automatic understanding of
  | the phenomenon and gives you no predictive power. It's one
  | thing to quantifiably know the blue button gets more clicks
  | than the green button, but that grants you no insight into why.
  | Many tech companies fall into this trap - where despite
  | investing heavily in experimentation their modeling of the
  | product space doesn't improve over time, since they fail to
  | take the step to convert observation to generalizable
  | hypotheses that improve their model for the product and market.
  | 
  | This last effect IMO is _huge_ in our industry, and is why the
  | same low-level experiments are being re-done over and over
  | again. This creates more product churn and reduces your product
  | velocity - since the lack of proven product models means you're
  | mostly flying blind and using ex-post-facto experimentation on
  | live users to figure out what to do.
  | 
  | Experimenting on button colors is all well and good, but the
  | end result of that data should be a color theory that explains
  | what colors to use when, not forever A/B testing every single
  | button color for the rest of time.
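 
  A minimal sketch of the kind of button-color A/B comparison being
  described, with illustrative counts rather than real data: the
  test can say that blue outperformed green, but nothing about why -
  the generalizable model still has to be built separately.
 
      from math import sqrt, erf
 
      def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
          """Two-proportion z-test on click-through rates."""
          p_a, p_b = clicks_a / n_a, clicks_b / n_b
          p_pool = (clicks_a + clicks_b) / (n_a + n_b)
          se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
          z = (p_a - p_b) / se
          p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
          return z, p_value
 
      # Hypothetical results: blue button vs. green button.
      z, p = two_proportion_z(620, 10_000, 540, 10_000)
      print(f"z={z:.2f}, p={p:.3f}")  # says which variant won, not why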
 
  | torginus wrote:
  | It probably tries to make the point that when a good metric
  | becomes a target, it ceases to be a good metric - either
  | through people gaming the metric, or because of diminishing
  | returns.
  | 
  | But I admit the article gets to its point in a muddy and
  | circuitous way, typical of business school anecdotes.
 
  | asveikau wrote:
  | > Why is that a surprise
  | 
  | It's surprising to many.
  | 
  | Many people will decide they are "data driven" and act like
  | this gives them infallibility and total correctness. Lots of
  | people need to be told this.
 
  | facorreia wrote:
  | My understanding is that the fallacy is cherry-picking a few
  | data points that are easy to measure and then claiming that
  | these are the most important data points, and then optimizing
  | execution for moving them in the "right" direction. This is
  | described in the metaphor of "searching under the streetlight".
  | The key to this con is to act confident and to challenge any
  | objections as lacking supporting data.
 
  | monkeybutton wrote:
  | I think the fallacy is when factors not captured or tracked by
  | data are treated as non-existent or inconsequential. In Vietnam,
  | they tried to quantify how pacified each village was as a
  | measurement of success. It did not work. This isn't saying no
  | data is better, just that data on its own does not win a war,
  | or the hearts and minds of villagers.
  | 
  | See also:
  | https://www.theatlantic.com/technology/archive/2017/10/the-c...
 
| martincmartin wrote:
| Two examples come to mind:
| 
| - Investing your money with the fund manager who has the highest
| returns over the last year, or even 5 years, is called
| "performance chasing" and generally has worse returns than
| investing in an index. Think investing in dot com stocks in 1999,
| and real estate in 2007.
| 
| - In Moneyball, about using stats to improve a baseball team,
| they didn't care how many home runs someone hit. In fact, getting
| runs that way was considered bad. They wanted to get players on
| base and then get them home, even if through walks. There's a
| part of the book where a player is getting a lot of home runs,
| and therefore increasing the team's score, but the manager
| doesn't like it. "It's a process", "we have a process" he keeps
| saying.
 
| beaconstudios wrote:
| > In another case of military metrics gone wrong, the US military
| reported success in undermining Taliban financing after it paid
| Afghan farmers to destroy their crops of opium poppies. What went
| unreported, however, was that the farmers planted larger fields
| of opium poppies in response, in the hopes that they might be
| paid by the US military to destroy the crops again. When US
| payments didn't come through, the opium was harvested and entered
| the international drug trade. Much of the profit went to support
| the Taliban's anti-American military operations.
| 
| That's pretty ironic because it's a near-perfect replication of
| the Cobra effect, the canonical example of bad incentives
| leading to unintended outcomes:
| https://en.m.wikipedia.org/wiki/Perverse_incentive
 
| aeternum wrote:
| From a game-theory POV, all wars are caused by misunderstandings
| / partial information.
| 
| If each side really knew the true strength of the other side, it
| would be clear which would win and therefore actual war is
| illogical and unnecessary.
 
  | chasd00 wrote:
  | I'll add that there are a lot of people who see death as the
  | door to paradise and fighting for their cause as the key to
  | that door. No amount of logic or information is going to stop
  | them.
  | 
  | when you're dealing with the mentally ill, all bets are off.
 
  | aeturnum wrote:
  | This seems untrue.
  | 
  | There are also genuine uncertainties about the future: things
  | that _neither_ side knows but will impact the outcome. Because
  | of that, there is always a range of outcomes for a conflict,
  | and so both sides would still have an incentive to carry out
  | the war.
  | 
  | Consider, for instance, if there are two countries (A and B)
  | looking at conflict with perfect information. Given their
  | relative strengths, the war will cost both $1 billion, but A
  | will win and reap $1 billion + $1 in plunder. In your model, B
  | will always surrender and give A $1, but the reality is there
  | is a big spread about both the costs and the gains. Lots
  | depends on how well each side fights and responds, even with
  | perfect knowledge of the other.
  | 
  | So...you know, if it's the US versus...the Philippines[1], then
  | sure, the uncertain range of outcomes is small enough that the
  | rational thing for the Philippines to do is surrender. But for
  | countries that are even reasonably well-matched, they would
  | want to fight with the goal of making the conflict unattractive
  | for the other side. Probably the best historical example of
  | this is the USSR in Afghanistan.
  | 
  | [1] I am sure the US would never invade and occupy them just
  | because the US found it convenient ;)
 
    | thaumasiotes wrote:
    | > Consider, for instance, if there are two countries (A and
    | B) looking at conflict with perfect information. Given their
    | relative strengths, the war will cost both $1 billion, but A
    | will win and reap $1 billion + $1 in plunder. In your model,
    | B will always surrender and give A $1
    | 
    | In the perfect information scenario, B will surrender more
    | than $1. B is better off at any tribute level up to
    | $1,000,000,000.
 
  | crazy1van wrote:
  | > If each side really knew the true strength of the other side,
  | it would be clear which would win
  | 
  | This is too simplistic. Often wars are heavily influenced by
  | things other than just the strengths of both sides. For example
  | - What if the ground hadn't been soaked from days of rain at
  | Agincourt and Waterloo? What if the Germans hadn't held back
  | their armor reinforcements for so long on D-Day?
 
    | asdff wrote:
    | They'd still lose. Wars are won not by individual battles but
    | by logistics. Modern militaries don't even engage if there
    | isn't a lopsided power imbalance favoring their success.
 
      | chasd00 wrote:
      | if you find yourself in a fair fight it's time to re-think
      | your strategy
 
      | thaumasiotes wrote:
      | It only takes one side to engage.
 
        | asdff wrote:
        | Look at the orders-of-magnitude differences in casualties
        | in Desert Storm, Iraq, and Afghanistan. When Americans
        | battle, they ensure they have a massive power advantage
        | for every fight, and this is abundantly clear just from
        | the casualty data. Even if the other side engages first,
        | like in Desert Storm, Americans ensure that they don't
        | land boots unless they have a tactical advantage, which
        | they had in Desert Storm thanks to technology that Saddam
        | and his army, one of the largest and most experienced land
        | forces in the world at the time, could not begin to
        | compete with.
 
      | 0xdeadbeefbabe wrote:
      | Yeah like in Black Hawk Down.
 
        | asdff wrote:
        | In real life, the Battle of Mogadishu saw Americans with
        | 10x fewer casualties. Most modern battles have numbers
        | like this, with an order of magnitude fewer casualties on
        | the American side, because American logistics planners try
        | to ensure a massive power advantage.
 
  | hackeraccount wrote:
  | Is that true? What if the situation is that I am currently
  | stronger than you but can expect to become weaker in the
  | future? Why not force a war now if I can expect to win?
  | 
  | I still haven't finished it (too depressing) but my reading of
  | the book The Sleepwalkers about the events leading up to WWI is
  | that all the parties thought they were in that place; even if
  | they all had a full picture it's certainly possible those
  | beliefs would in fact be true.
 
    | aeternum wrote:
    | If I know that you are currently stronger and would
    | ultimately win a war, I should logically concede now since I
    | will lose anyway.
    | 
    | If I know that you are currently stronger but we both also
    | know that I can draw out a war such that I eventually become
    | stronger, then you shouldn't go to war since you will
    | ultimately lose.
 
      | vkou wrote:
      | > If I know that you are currently stronger and would
      | ultimately win a war, I should logically concede now since
      | I will lose anyway.
      | 
      | You'll often lose less if you fight to a loss.
      | 
      | In fact, you often won't have to fight at all if you make
      | it clear to your adversary that you will fight to a loss.
      | 
      | If I'm a street thug, I'm not going to try to mug someone
      | who clearly communicates that their response to a mugging
      | is detonating their suicide vest. I will, however,
      | repeatedly victimize someone who clearly communicates that
      | they are a pushover.
      | 
      | This is an iterative game, against the same players. Loss
      | aversion is not a winning strategy to pursue in the long-
      | term. Something resembling tit-for-tat is.
 
  | vlovich123 wrote:
  | That's a fallacious line of reasoning on several fronts.
  | 
  | A) The beneficiaries of a war need not be the ones who pay the
  | cost. If you personally stand to benefit from the war itself,
  | regardless of win or lose, you want the war.
  | 
  | B) It's applying a rational economics approach when it's pretty
  | clear that humans aren't rational. Japan kept fighting even
  | when it was clear they would lose. Pride, image, your future
  | after the war. All things that impact the decision that gets
  | made.
  | 
  | C) War isn't typically total like it was in WWII. Most wars are
  | just skirmishes or prolonged skirmishes that don't meaningfully
  | change borders. In such a scenario, a single "war" loss may
  | still be an important fight to have in the middle of the
  | overall conflict (i.e. to signal to your opponent that you're
  | not a pushover and that continuing on their current path has
  | consequences).
  | 
  | D) Win/lose isn't the only outcome of war. Lose/lose is also
  | another scenario. Additionally, you might lose the war but
  | winning may not be the goal. For example, maybe it only takes
  | me 1/10th the resources for me to wage the war that it takes
  | for you to win. A single war might cost me 100 units, but it's
  | important for me to be able to cause you to spend 1000 units
  | because then I have a 900 unit advantage in some other area.
  | 
  | There's many more, but I don't think imperfect information is
  | the only reason war exists.
 
    | asdff wrote:
    | I really don't think the GP is wrong here. Yes there are
    | ideologues, but there were plenty in the Japanese high
    | command who understood logistics and understood that war with
    | the U.S. would end the Japanese empire. It was the same way
    | with Germany. The people who understood logistics knew it
    | would be absolutely impossible to win a two front war and
    | nearly impossible to win a one front war with the U.S. having
    | a thumb on the scales just with lend lease alone. Germany
    | sent their best generals to the east and they knew they had
    | no hope but to delay the inevitable, and no alternative but
    | to listen to Hitler. They continued to fight because defying
    | Hitler's madness often meant worse outcomes for these
    | commanders than making peace with the Allies in 1945. The
    | war in Japan ended when it became impossible for the emperor
    | to continue with the cognitive dissonance and ignore the fact
    | that the fight had ended years before it even began.
    | Logistics have won every war there ever was.
 
    | tomrod wrote:
    | (1) Externalities are handled in the game theory literature
    | 
    | (2) Preference definitions are not in scope for most game
    | theory lit; what you're describing are preferences and
    | expected utility, which ARE in the literature and metamodel
    | 
    | (3) This is not really true. It depends on how big you are
    | and how big your coalition is. Consider that Saddam Hussein
    | is no longer in charge of Iraq.
    | 
    | (4) Yes, potential outcomes are considered under game theory.
    | 
    | I get your point that OP's statement feels reductionist, but
    | I wanted to explain that the OP is actually correct.
 
  | [deleted]
 
  | s28l wrote:
  | There is a lot to disagree with here. First, the article itself
  | provides an example of an army with a superior fighting force
  | that had far fewer casualties than the other side, yet still
  | lost the war. That was due to the civilians back at home
  | lacking the stomach for a long, protracted war abroad. So even
  | if one side has superior strength, and even if the side with
  | the superior strength wins every battle, they might still lose
  | the war.
  | 
  | Another issue is your implicit assumption that the side with
  | the superior fighting force will always win the war, but I
  | don't think you can make that assumption. Just like the better
  | football team can lose to the underdog, there is a stochastic
  | element to warfare. A sudden bit of bad weather can turn the
  | tide of a battle. There are countless other elements that are
  | unobservable and unpredictable that can decide which side wins.
  | 
  | There are also asymmetric payoffs to going to war. A nation
  | might have a slim chance of victory, but the cost of defeat or
  | surrender might be genocide or subjection. How do you assign a
  | payoff to the choice "surrender" when the outcome is the
  | destruction of your nation as it once existed?
  | 
  | [0]
  | https://en.wikipedia.org/wiki/Strategy_(game_theory)#Pure_an...
 
  | 13415 wrote:
  | Wait a minute. Shouldn't the goal of the defender be to make it
  | clear to the attacker that they would incur losses that
  | outweigh possible wins for the attacker, thereby making the
  | attack a lose-lose scenario? This is only possible if the
  | defender makes it credible to the potential attacker that they
  | will defend their country even when the costs would be
  | extremely high and they cannot possibly win the conflict. In
  | other words, at least from deterrence point of view it's not
  | about winning, it's about making sure the attacker would
  | overall lose more than gain (which is not the same as winning
  | against the attacker).
 
| sf_rob wrote:
| This isn't a great article IMO (I say as someone who is a big fan
| of qualitative user research, mixed methods, etc).
| 
| Per the first example, many types of attitudinal data can be
| quantified and conflating attitudinal data with qualitative data
| is itself a fallacy. It's possible that there were quantitative
| attitudinal signals that could have been captured or created as
| inputs to a more accurate model.
| 
| Per the second example, this is more a question of data validity
| than the metric itself. If the metric could be validated through
| better design and gamification prevented then it would likely
| still be a helpful indicator. Granted this is a very hard
| problem.
 
  | serial_dev wrote:
  | My thoughts, too. It's not that quantitative data is bad, he
  | just picked the wrong one.
 
| scubbo wrote:
| Surprised not to see Goodhart's Law[0] referenced here - "When a
| measure becomes a target, it ceases to be a good measure". Not
| the same concept, but a related one (as is the Cobra Effect[1],
| of which the poppy-field burning is an example)
| 
| [0] https://en.wikipedia.org/wiki/Goodhart's_law [1]
| https://en.wikipedia.org/wiki/Perverse_incentive#The_origina...
 
  | musicale wrote:
  | From [1]: "It was discovered that, by providing company
  | executives with bonuses for reporting higher earnings,
  | executives at Fannie Mae and other large corporations were
  | encouraged to artificially inflate earnings statements and make
  | decisions targeting short-term gains at the expense of long-
  | term profitability."
  | 
  | What a shocking revelation! Could it possibly apply to other
  | companies that report quarterly results? ;-)
 
    | scubbo wrote:
    | I vividly remember a "small-hands" (smaller all-hands) very
    | early in my career, where the VP basically said "we recognize
    | that oncall is a burden, and we want to compensate you for
    | it. At first we thought we should give you a bonus related to
    | how many times you get paged - but then we realized that
    | that's incentivizing you to build flaky systems. Then we
    | considered giving a bonus in inverse relation to how often
    | you're paged - but then we thought, no, they're smart
    | engineers, they'll just turn off the monitoring and collect
    | the bonus. So we're giving you a flat rate".
    | 
    | In the absence of some untamperable objective way to measure
    | service health (the concept of SLAs was a distant dream at
    | that point), can't fault that reasoning, tbh!
 
| csours wrote:
| A different view of the problem:
| 
| https://www.nngroup.com/articles/campbells-law/
 
| snidane wrote:
| I wonder if opposition to the dominant hyperquantitative, every-
| fart-tracked-in-Jira, "can't measure, can't manage" trend will
| become widespread.
| 
| That idea was never pushed by anybody from the industry, but got
| spread by project management frauds.
| 
| https://deming.org/myth-if-you-cant-measure-it-you-cant-mana...
 
| skybrian wrote:
| Maybe a better way of thinking about it is that a process that
| gathers numbers can be useful, but understanding the context in
| which the numbers were gathered (methodology and whatever else is
| happening at the time) is more important than the numbers
| themselves. Without context, you don't know what you have.
| 
| For example, surveys gather numbers, but if you don't understand
| how the people who answered the questions interpreted them, you
| don't know what the numbers mean. Asking people what they really
| meant by their answer is only possible for the people doing the
| survey.
 
| hwers wrote:
| At this point "fallacy" just strikes the same note to me as
| "misinformation". It's mostly used as a political device to get
| ideas you want to be true into social consensus but the reality
| is usually more ambiguous.
 
| [deleted]
 
| NovemberWhiskey wrote:
| Or, put slightly more succinctly: "you get what you measure"
 
| shellback3 wrote:
| It should be remembered that McNamara cut his teeth working for
| the office of statistics with Curtis LeMay in the use of B29
| aircraft. He was proud that he was part of a larger effort using
| statistics and other math that helped to win the war and, with
| that hammer, most everything could be a nail.
 
___________________________________________________________________
(page generated 2022-02-01 23:00 UTC)