
The dark side of expertise


By Jake Edge
January 15, 2020
LCA

Everyone has expertise in some things, which is normally seen as a good thing to have. But Dr. Sean Brady gave some examples of ways that our expertise can lead us astray, and actually cause us to make worse decisions, in a keynote at the 2020 linux.conf.au. Brady is a forensic engineer who specializes in analyzing engineering failures to try to discover the root causes behind them. The talk gave real-world examples of expertise gone wrong, as well as looking at some of the psychological research that demonstrates the problem. It was an interesting view into the ways that our brains work—and fail to work—in situations where our expertise may be sending our thoughts down the wrong path.

Brady began his talk by going back to 1971 and a project to build a civic center arena in Hartford, Connecticut in the US. The building was meant to hold 10,000 seats; it had a large roof that was a "spiderweb of steel members". That roof would be sitting on four columns; it was to be built on the ground and then lifted into place.

[Dr. Sean Brady]

As it was being built, the contractors doing the construction reported that the roof was sagging while it was still on the ground. The design engineers checked their calculations and proclaimed that they were all correct, so building (and raising) should proceed. Once on the columns, the roof was bending and sagging twice as much as the engineers had specified, but after checking again, the designers said that the calculations were all correct.

Other problems arose during construction and, each time the contractors pointed out some place where the design and reality did not mesh, the designers would dutifully check their calculations again and proclaim that all was well. After the roof went up, Hartford residents twice contacted the city government about problems they could see with it; the engineers checked the calculations once again and pronounced it all to be fine.

In 1978, the first major snowstorm since the construction resulted in an amount of snow that was only half of the rated capacity of the roof—but the roof caved in. Thankfully, that happened in the middle of the night; only six hours earlier there had been 5,000 people in it for a basketball game.

So, Brady asked, what went wrong here? There were "reasonably smart design engineers" behind the plans, but there were also multiple reports of problems and none of those engineers picked up on what had gone wrong. In fact, there seemed to be a reluctance to even admit that there was a problem of any kind. It is something that is seen in all fields when analyzing the causes of a failure; it turns out that "people are involved". "We have amazingly creative ways to stuff things up."

Expertise

Before returning to finish the story about the arena, Brady switched gears a bit; there are lots of different human factors that one could look at for failures like that, he said, but he would be focusing on the idea of expertise. Humans possess expertise in various areas; expertise is important for us to be able to do our jobs effectively, for example. We also tend to think that more expertise is better and that it reduces the chances of mistakes. By and large, that is correct, but what if sometimes it isn't? "The greatest obstacle to knowledge is not ignorance ... it is the illusion of knowledge", he said, quoting (or paraphrasing) a famous aphorism.

[Müller-Lyer optical illusion]

Before digging further in, he wanted to show "how awkward your brain is". He did so with a variant of the Müller-Lyer optical illusion that shows two lines with arrows at the ends, one set pointing out and the other pointing in (a version from Wikipedia can be seen on the right). The straight line segments are the same length, which he demonstrated by placing vertical lines on the image, even though that's not what people see. He asked the audience to keep looking at the slide as he took the lines away and restored them; each time the vertical lines were gone, the line with inward pointing arrows would return to looking longer than the other. "It's like you learned absolutely nothing", he said to laughter. Your brain knows they are the same length, but it cannot make your eye see that.

Mann Gulch

A similar effect can be seen in lots of other areas of human endeavor, he said. He turned to the example of the Mann Gulch forest fire in 1949 in the US state of Montana. A small fire on the south-facing side of a gulch (or valley) near the Missouri River was called in and a team of smokejumpers was dispatched to fight it before it could really get going.

Unfortunately, the weather conditions (abnormally high temperatures, dry air, and wind) turned the small fire into an enormous conflagration in fairly short order. Less than an hour after the smokejumpers had gathered up the equipment dropped from the plane (and found that the radio had not survived the jump due to a malfunctioning parachute), the firefighters were overrun by the fire and most of them perished.

The foreman of the team, Wagner Dodge, followed the generally accepted practices in leading the men to the north-facing slope, which was essentially just covered with waist-high grass, and then down toward the river to get to the flank of the fire. From what they knew, the fire was burning in the heavily timbered slope on the other side of the gulch. As it turned out, the fire had already jumped the gulch and was burning extremely quickly toward them, pushed by 20-40mph winds directly up the gulch into their faces. Once he recognized the problem, Dodge realized that the team needed to head up the steep slope to the top of the ridge and get over to the other side of it, which was complicated by the presence of a cliff at the top that would need to be surmounted or crossed in some fashion.

When they turned back and started up the ridge, the fire was 150 yards away and moving at 3mph; in the next 12-14 minutes it completely overtook the team. The men were carrying heavy packs and equipment so they were only moving at around 1mph on the steep slope. Dodge gave the order for everyone to drop their equipment and packs to speed their way up the slope, but many of the men seemed simply unable to do that, which slowed them down too much.

It took many years to understand what happened: the fire underwent a transformation, called a "blow up", that made it speed up and intensify. It was burning so hard that the convection was creating a vacuum, which just served to pull in even more air and intensify the fire further. It was essentially a "tornado of fire" chasing the men up the slope and, by then, it was moving at around 7mph.

Once Dodge realized that many of them were not going to make it to (and over) the ridge, he had to come up with something. For perhaps the first time in a firefighting situation, he lit a fire in front of them that quickly burned a large patch of ground up and away from the team. His idea was that the main fire would not have any fuel in that area. He ordered the men to join him in that no-fuel zone to hunker down and cover themselves as the fire roared past them, but none could apparently bring themselves to do so. Only the two youngest smokejumpers, who had reached the ridge and miraculously found a way through a crevice in the cliff in zero visibility, survived along with Dodge. Thirteen men died from the fire.

There are two things that Brady wanted to focus on. Why did the men not drop their tools and packs? And why didn't they join Dodge in the burned-out zone? If we can answer those questions, we can understand a lot about how we make decisions under pressure, he said.

Priming

In order to do that, he wanted to talk about a psychology term: priming. The idea is that certain information that your brain takes in "primes" you for a certain course of action. It is generally subconscious and difficult to overcome.

There was a famous experiment done with students at New York University that demonstrates priming. The students were called into a room where they were given a series of lists of five words that they needed to reorder to create four-word sentences, thus discarding one word. The non-control group had a specific set of words that were seeded within their lists; those words were things that normally would be associated with elderly people.

The students were then told to go up the hallway to a different room where there would be another test. What the students didn't know was that the test was actually in the hallway: the time it took each participant to walk down the hall was measured. It turned out that the students who had been exposed to the "elderly words" walked more slowly down the hall. Attendees might be inclined to call that "absolute crap", Brady suggested, but it is not; it is repeatable and even has a name, "the Florida effect", because Florida was used as one of the words associated with the elderly.

It seems insane, but those words had the effect of priming the participants to act a bit more elderly, he said. So to try to prove that priming is real, he played a different word game with the audience; it is called a "remote associative test". He put up three words on the screen (e.g. blue, knife, cottage) and the audience was to choose another word that went with all three (cheese, in that case). The audience did quite well on a few rounds of that test.

But then Brady changed things up. He said that he would put up three words, each of which was followed by another in parentheses (e.g. dark (light), shot (gun), sun (moon), answer: glasses); he told everyone to not even look at the parenthesized words. When he put the first test up, the silence was eye-opening. The words in parentheses, which no one could stop themselves from reading, of course, would send the brain down the wrong path; it would take a lot of effort to overcome the "negative priming" those words would cause. It is, in fact, almost impossible to do so.
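
The mechanics of the test are simple enough to sketch in code. What follows is a minimal, hypothetical quiz harness in Python; the word triads are the ones Brady used, but the harness itself (its names and structure) is purely illustrative:

    # Minimal sketch of a remote associative test (RAT).  Each item is
    # three cue words plus the single word that pairs with all three;
    # the primed variant also displays a misleading word after each cue.
    ITEMS = [
        # (cues, priming words or None, answer)
        (("blue", "knife", "cottage"), None, "cheese"),
        (("dark", "shot", "sun"), ("light", "gun", "moon"), "glasses"),
    ]

    def run_quiz(items, show_primes=False):
        score = 0
        for cues, primes, answer in items:
            if show_primes and primes:
                # Show each cue with its negative-priming word in parentheses.
                shown = ", ".join(f"{c} ({p})" for c, p in zip(cues, primes))
            else:
                shown = ", ".join(cues)
            guess = input(f"What word goes with: {shown}? ").strip().lower()
            if guess == answer:
                score += 1
            else:
                print(f"The expected answer was '{answer}'.")
        print(f"Score: {score}/{len(items)}")

    if __name__ == "__main__":
        # Run the primed variant; pass show_primes=False for the plain one.
        run_quiz(ITEMS, show_primes=True)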

The tests were designed by "evil psychologists" to send your brain down the wrong solution path, he said; once that happens, "you cannot stop it". "We are not nearly as rational as we think we are". If he repeated the test later without the extra negative-priming words, people would be able to come up with the right solution because their brain had time to forget about the earlier path (and the words that caused it). This is the same effect that causes people to find a solution to a problem they have in the shower or on a walk; the negative-priming influence of their work surroundings, which reinforce the non-solution path they have been on, is forgotten, so other solution paths open up.

"So at this point you might say, 'hang on Sean, those are some fancy word games, but I'm a trained professional'", he said to laughter. He suggested that some in the audience might be thinking that their expertise would save them from the effects of negative priming. Some researchers at the University of Pittsburgh wanted to test whether our expertise could prime us in the way that the parenthesized words did. They designed a study to see if they could find out.

They picked a control group, then another group made up of avid baseball fans, and did a remote associative test with both groups. Instead of putting words in parentheses, though, they allowed the baseball fans to prime themselves by using words from common baseball phrases as the first word in the test; that word was deliberately chosen to send them down an incorrect solution path.

For example, they would use "strike", "white", and "medal"; a baseball fan would think of "out", which works for the first two, but not the last, and would get stuck at that point. Those who don't have baseball expertise will likely arrive at the proper solution, which is "gold". As might be guessed, the baseball fans "absolutely crashed and burned" on the test. Interestingly, at the end of the test the fans were asked whether they had used their baseball knowledge; they said: "No, why would I? It had nothing to do with baseball." The expertise was being used subconsciously.

In another test, they warned the baseball fans ahead of time that the test was meant to mess with their heads and use their baseball knowledge against them, so they should not use that knowledge at all. The fans did just as poorly relative to the control group, which showed that the use of expertise is not only subconscious, but also automatic.

Back to the fire

Brady then circled back to the forest fire; the men in Mann Gulch "can no sooner drop their firefighting expertise than the baseball fans could". They could not drop their physical tools and they could not drop their mental tools that told them they had to get to the ridge. They also could not accept new tools, he said; when Dodge showed them the ash-covered area that the new fire had created, they did not accept it as a new tool, instead they "defaulted to their existing expertise and worked with the tools they had".

There is a name for this, he said, it is called "The Law of the Instrument": "When all you have is a hammer, everything looks like a nail." We are all running around with our respective hammers looking for nails to hit. "We see the world through the prism of our expertise and we cannot stop ourselves from doing so."

After Mann Gulch, firefighters were told that if they got into a situation of that sort, they should drop their tools and run, but that still did not work. There were fatalities where firefighters were close to their safe zones but found with their packs still on and holding their chainsaws. The next step was to properly retrain them by having them run an obstacle course with and without their gear, then showing them how much faster they could run without it. It sounds silly, Brady said, but it worked because it gave them a new tool in their mental toolbox.

The one exception at Mann Gulch, though, was Dodge, who dropped both his physical and mental tools. He came up with a new tool on the spot; "escape fires" became part of the training for firefighters after Mann Gulch. How did that happen? Psychologists have a term for this as well: "creative desperation". When their backs are truly to the wall, some people will recognize that their expertise is not working and will not solve the problem at hand. At that point they drop their tools and see the facts for what they are, which allows them to find a solution outside the path their expertise was leading them down.

Brady then returned all the way to the beginning and the Hartford civic center roof collapse. Even though there were repeated warnings that something was wrong with the design of the roof, the engineers defaulted to their expertise: "Our calculations say it's OK, so it must be OK."

This was the early 1970s, he said; why were these engineers so confident in their calculations? As many in the audience guessed, the reason was "computers". In fact, when they won the bid, the engineers told the city of Hartford that they could save half a million dollars in construction costs "if you buy us this new, whiz-bang thing called a computer". It turned out that the computer worked fine, but it was given the wrong inputs. The engineers had made an emotional investment in the new technology, so it was inconceivable to them that it could be giving them the wrong answers.

He concluded by saying that no matter what field we are in, we will all encounter situations where our expertise is not a perfect fit for the problem at hand. It is important to try to recognize that situation, drop the tools that we are trying to default to, and see the facts for what they are, as Dodge did in Mann Gulch. He ended with a quote from Lao Tzu: "In pursuit of knowledge, every day something is acquired. In pursuit of wisdom, every day something is dropped."

It was an engaging, thought-provoking talk, which is generally the case with keynotes at linux.conf.au. Brady is a good speaker with a nicely crafted talk; there is certainly more that interested readers will find in the YouTube video of his presentation.

[I would like to thank LWN's travel sponsor, the Linux Foundation, for travel assistance to Gold Coast for linux.conf.au.]

Index entries for this article
Conference: linux.conf.au/2020



The dark side of expertise

Posted Jan 15, 2020 21:45 UTC (Wed) by butcher (guest, #856) [Link]

+1!

EVERY single failure investigation I've worked in my 18 years of aerospace has gone a different way than what I originally surmised. I believe experience helps in the effective structuring of an investigation, but only observation and measurement will tell you the truth...

The dark side of expertise

Posted Jan 17, 2020 5:00 UTC (Fri) by smitty_one_each (subscriber, #28989) [Link]

When I look at an error report, I try to keep in mind the non-zero likelihood that whatever the system returned as an error may have nothing to do with the actual problem.

It's just so easy to get tunnel vision and head off in the wrong direction when troubleshooting.

The dark side of expertise

Posted Jan 17, 2020 7:37 UTC (Fri) by madhatter (subscriber, #4665) [Link]

Very true. Once upon a time my favourite Oracle DBA turned to me, head in hands, and said "Somehow, somewhere, something has gone wrong". It's still one of the best problem reports I've ever received, since it embedded no assumptions, but instead provided me an opportunity to ask questions to home in on the precise nature, and then the likely causes, of the fault.

When I left that job, I had three t-shirts made up, for me, him, and the Windows admin, with those six words on, just to remind us of how helpful a non-misleading problem report can be.

The dark side of expertise

Posted Jan 17, 2020 21:01 UTC (Fri) by smitty_one_each (subscriber, #28989) [Link]

Calls for a celebratory haiku:

Somehow and somewhere
Something has gone wrong. I can
Be less specific.

The dark side of expertise

Posted Jan 15, 2020 22:16 UTC (Wed) by smoogen (subscriber, #97) [Link]

Thank you for that detailed summary of the talk. It was very helpful and got me to look up some new things.

The dark side of expertise

Posted Jan 15, 2020 22:18 UTC (Wed) by admalledd (subscriber, #95347) [Link]

For reasons, I have been exposed to that word-priming game before, and even here, in pure text as the author puts it, I can get stuck far longer than I would hope. Going into it knowing what is about to happen (and even seeing the same examples again, dang it!) still doesn't always help.

The dark side of expertise

Posted Jan 15, 2020 22:35 UTC (Wed) by dmiller (guest, #115155) [Link]

Subsequent attempts with larger sample sizes have failed to replicate either of the priming studies mentioned [0][1], which casts considerable doubt on their findings.

[0] https://journals.plos.org/plosone/article?id=10.1371/jour...
[1] https://journals.sagepub.com/doi/abs/10.1177/174569161875...

The dark side of expertise

Posted Jan 16, 2020 13:44 UTC (Thu) by mikapfl (subscriber, #84646) [Link]

Thank you for these links, very insightful. The first paper especially is really interesting, because it does find an effect, but one mediated via the experimenter, which is also pretty mind-boggling. (%

The dark side of expertise

Posted Jan 19, 2020 16:51 UTC (Sun) by emk (subscriber, #1128) [Link]

There's a nice summary of priming research here, and of just how much of it has recently been cast into doubt: https://replicationindex.com/2017/02/02/reconstruction-of...

I suppose this leads to another failure mode of expertise: Trusting what you find in journals too uncritically.

That's not to say that expertise won't occasionally lead you astray while troubleshooting, of course.

The dark side of expertise

Posted Jan 21, 2020 4:57 UTC (Tue) by ssmith32 (subscriber, #72404) [Link]

> That's not to say that expertise won't occasionally lead you astray while troubleshooting, of course.

Or when running / interpreting social psychology experiments ;)

The dark side of expertise

Posted Jan 15, 2020 22:42 UTC (Wed) by saffroy (guest, #43999) [Link]

On the same topic, I would recommend the recent book "Range" by David Epstein, which gives more reasons for paying attention to the value of non-experts (it also uses the example of the Mann Gulch fire, with many others).

The dark side of expertise

Posted Jan 15, 2020 23:18 UTC (Wed) by me@jasonclinton.com (guest, #52701) [Link]

Yes, second this so much! Such a great book!

The dark side of expertise

Posted Jan 16, 2020 15:19 UTC (Thu) by rahvin (guest, #16953) [Link]

One of the things we do to counter this sort of problem in civil engineering is an independent technical review: a review by someone from outside the project who is subject to none of the "institutional project knowledge" that may unintentionally bias the design. These reviews can be especially helpful when the design specifications have changed multiple times, because the outside reviewer can act as a check on each of those decisions.

Good talk, and an excellent summary; forensic engineering is an interesting field to me.

The dark side of expertise

Posted Jan 16, 2020 8:38 UTC (Thu) by marcH (subscriber, #57642) [Link]

Thanks! That explains why we all think we know what to optimize before having even started to measure anything :-)

The dark side of expertise

Posted Jan 16, 2020 12:29 UTC (Thu) by fcrozat (subscriber, #175) [Link]

For people interested, this keynote recording is already available at https://www.youtube.com/watch?v=Yv4tI6939q0

The dark side of expertise

Posted Jan 16, 2020 12:30 UTC (Thu) by Archimedes (subscriber, #125143) [Link]

In Germany there is an interesting double meaning of the word "Kompetenz" (competence) in the public sector.
In government agencies the word defines the authority of your "job", not whether you have the expertise to fulfill that "job".

The Hartford disaster/incident also shows a failure of the "other meaning": because the authority of each entity was limited, the project ended up in a back-and-forth loop, and nobody took the authority to exit it. (The exit, in the end, was time: they finished building something broken.)
That is not to say that the problem would not have happened had such an exit existed; with a reshuffled set of expertise and authority, so to speak, there would only have been a chance of a different end result.

I have seen quite a few projects where each entity was locally correct. None of the entities had all the information to check (even in retrospect) whether they were also globally correct(ish), and none was able to, was allowed to, or took the initiative to do so. In the end the project failed at its initial goal, but was "saved" by exactly an "out of the box"/"out of project" way of solving or circumventing the problem: by "other expertise" or "other authority", or of course some mixture of the two.

It is possible to read this article with the German "Kompetenz", in the sense of authority, in mind and end up with the same conclusions (only the examples don't fit, of course).

P.S.
I like people who can honestly play chess (or similar) against themselves and both win and lose in the end, as they can (somewhat) honestly look at both sides of a solution/problem/situation. They are the kind who see screws even when they only have a hammer ...

The dark side of expertise

Posted Jan 16, 2020 15:26 UTC (Thu) by mathstuf (subscriber, #69389) [Link]

Some of the episodes in Cautionary Tales[1] are relevant to this (particularly the Piano episode). I can recommend all of them in any case. I imagine his book(s) are also relevant, but I haven't read them.

[1] http://timharford.com/articles/cautionarytales/

The dark side of expertise

Posted Jan 16, 2020 16:27 UTC (Thu) by kmweber (guest, #114635) [Link]

Beside the point, I know, but am I the only one who sees the lines with inward-pointing arrows as longer?

The dark side of expertise

Posted Jan 16, 2020 17:09 UTC (Thu) by kid_meier (subscriber, #93987) [Link]

No, I see the line with both arrows pointing inward as longer too.

The dark side of expertise

Posted Jan 16, 2020 19:47 UTC (Thu) by am (subscriber, #69042) [Link]

No, that's how most people perceive it. From Wikipedia:
The line segment forming the shaft of the arrow with two tails is perceived to be longer than that forming the shaft of the arrow with two heads.

The dark side of expertise

Posted Feb 1, 2020 0:48 UTC (Sat) by akupries (subscriber, #4268) [Link]

There is also a dynamic variant of this illusion:

The dark side of expertise

Posted Jan 16, 2020 20:14 UTC (Thu) by jake (editor, #205) [Link]

> but am I the only one who sees the lines with inward-pointing arrows as longer?

oops, no ... not sure how that slipped through, but I have adjusted the text.

thanks,

jake

The dark side of expertise

Posted Jan 16, 2020 23:52 UTC (Thu) by monty55 (subscriber, #102528) [Link]

See also "educated incapacity", originally "trained incapacity":

https://www.hudson.org/research/2219-the-expert-and-educa...

The dark side of expertise

Posted Jan 17, 2020 0:55 UTC (Fri) by gerdesj (subscriber, #5446) [Link]

Jake - cracking write up. Obviously something piqued your inner journo here.

My favourite Civ Eng screw up to add to the Struct Eng buggeration mentioned already is this:

The Tacoma Narrows bridge collapse is a classic. I was a Civ Eng student at Plymouth Polytechnic (Devon, UK) in 1990ish and it was taught as a lesson pretty early on. The basic lesson is resonance - it shook itself to bits. When a large military force marches across a bridge, they break step: the synchronised steps of the soldiers can soon cause bridges of various designs to fail in ways that the designer never thought of.

The London Millennium bridge - https://en.wikipedia.org/wiki/Millennium_Bridge,_London - was an accident waiting to happen that all of the designers of the bloody thing would have been taught about or heard of.

I only studied Civ Eng; I ended up in IT. But I clearly remember seeing the new bridge opening on TV and thinking it would suffer from lateral stability issues of some sort. How the hell could a near amateur see what the experts couldn't? Don't get me wrong, the design is a pretty clever spin on the suspension bridge, putting the suspension structure off to the side, but that means you lose vertical control, which also often determines lateral control. I've made up those terms because it is quite hard to describe how a 3D structure works. You sort of see it in your head and know when it should work. The actual numbers need quantifying, but you can see obvious oddities. Despite being an IT bod I can still chase a 3D frame and see the stresses and strains in my head, in terms of direction and a taste of magnitude.

Now I come to look at it that is probably not too normal. Are there any real Structural or Civil Engineers out there that can provide some insight?

The dark side of expertise

Posted Jan 17, 2020 1:16 UTC (Fri) by Cyberax (✭ supporter ✭, #52523) [Link]

> The Tacoma Narrows bridge breakup is a classic. I was a Civ Eng student at Plymouth Polytechnic (Devon, UK) in 1990ish and it was taught as a lesson pretty early on. The basic lesson is resonance - it shook itself to bits.
Not quite. It was caused by flutter, a self-reinforcing aeroelastic behavior in which a structure deforms under aerodynamic forces and enters a mode where those forces amplify the motion.

It was not a simple resonance; no number of marching soldiers could have collapsed the bridge.

The dark side of expertise

Posted Jan 17, 2020 1:31 UTC (Fri) by gerdesj (subscriber, #5446) [Link]

As you say, not soldiers; in the case of the Tacoma Narrows bridge it was wind. I didn't mention that because it was self-evident in my head!

Resonance was still the fault. You call it flutter, but that was the input, not the cause. The wind made it shake a bit, then a bit more, then a lot more, and then it shook itself to bits in a quite horrible failure.

As a result, the cross section of suspension bridges was significantly changed. I can't quite remember, but I would imagine that they now have negative lift, i.e. the faster the wind, the more downforce, plus an extra 5 mm of steel all round to account for the extra force (I made the 5 mm up, but it's probably not too far off).

The dark side of expertise

Posted Jan 23, 2020 20:48 UTC (Thu) by Wol (subscriber, #4433) [Link]

Yup, I remember the Millennium Bridge. A classic example.

As I remember it, all the textbooks talked about VERTICAL resonance, and nobody thought of HORIZONTAL resonance. Plus, this same problem had recurred at maybe 20-year intervals, and each time it was rather hushed up out of embarrassment that the problem wasn't spotted, with the result that it didn't get into the literature, didn't get into the engineering consciousness, and got repeated again and again.

Cheers,
Wol

The dark side of expertise

Posted Jan 23, 2020 21:16 UTC (Thu) by Wol (subscriber, #4433) [Link]

> Resonance was still the fault. You call it flutter and that was the input but not the cause. The wind made it shake a bit, then a bit more, then a lot more then it shook itself to bits in a quite horrible failure.

Except we probably need to check Snopes. I remember seeing some program that actually investigated the failure, and wind alone could not cause what happened.

Yes it shook itself to bits, but there was some prior failure without which it would not have been anywhere near as bad.

Cheers,
Wol

The dark side of expertise

Posted Jan 24, 2020 9:53 UTC (Fri) by gerdesj (subscriber, #5446) [Link]

The Millennium Bridge failure was pinned on people walking in lock-step, but the failure was still due to resonance. This one was even more embarrassing than the Tacoma Narrows disaster; everyone + dog knows that soldiers are told to break stride when crossing a bridge. I think that effect has been known about since the 19th century.

When people walked across the bridge, it began to sway laterally. This swaying makes people start to sync up their walking rhythm. You will almost certainly have experienced this somewhere on a small scale. The bridge now has several pistons attached to damp down the lateral movement. Any physicist will be able to describe and quantify the damped oscillation that they learned about at school, aged about 17!
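
For reference, the damped oscillation in question is the standard textbook second-order system; in the usual notation (nothing here is specific to the bridge or its dampers beyond the analogy):

    % Free response of a damped harmonic oscillator:
    % m = mass, c = damping coefficient (the pistons), k = stiffness.
    \[ m\ddot{x} + c\dot{x} + kx = 0, \qquad
       x(t) = A e^{-\zeta\omega_0 t}\cos(\omega_d t + \varphi) \]
    % Natural frequency, damping ratio, damped frequency (underdamped case):
    \[ \omega_0 = \sqrt{k/m}, \qquad
       \zeta = \frac{c}{2\sqrt{km}}, \qquad
       \omega_d = \omega_0\sqrt{1 - \zeta^2} \]

Adding dampers raises the damping ratio, so each successive sway decays faster instead of building up.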

The dark side of expertise

Posted Jan 24, 2020 19:55 UTC (Fri) by Wol (subscriber, #4433) [Link]

> When people walked across the bridge, it began to sway laterally.

And ALL the literature is (or was) about bridges BOUNCING VERTICALLY.

THAT was the problem: the engineers knew all about vertical bounce, and lateral sway never crossed their minds.

Cheers,
Wol

The dark side of expertise

Posted Jan 17, 2020 7:43 UTC (Fri) by jcm (subscriber, #18262) [Link]

“The tests were designed by "evil psychologists" to send your brain down the wrong solution path, he said; once that happens, "you cannot stop it"”

Ok, someone has to say it. Speculative execution side-channel attack. Thanks. Byeee.

The dark side of expertise

Posted Jan 21, 2020 0:22 UTC (Tue) by mirabilos (subscriber, #84359) [Link]

+1, Funny

/.

The dark side of expertise

Posted Jan 18, 2020 9:52 UTC (Sat) by mtaht (guest, #11087) [Link]

This was a really great talk that made me deeply reflect on multiple times "expertise" had screwed me up.

One example was not long in following... in my talk at this conference ("how congestion control really works in the bufferbloated age", blatant plug: https://www.youtube.com/watch?v=ZeCIbCzGY6k ), I'd set up people to act as packets and, being American, assumed that the sending queue was going to line up on the right side, not the left; being that we were in Australia, all the participants naturally wanted to line up on the left instead, to start with...

much "packet re-ordering" (and hilarity) ensued before we straightened it out.

The dark side of expertise

Posted Jan 22, 2020 1:53 UTC (Wed) by ringerc (subscriber, #3071) [Link]

If you enjoy this, read Bruce Schneier's work like Beyond Fear. Also check out the amazing blog/book "You are not so Smart" (https://youarenotsosmart.com/) .

Questioning your perceptions and reasoning is powerful.

When the experts say "trust us"

Posted Jan 23, 2020 21:09 UTC (Thu) by Wol (subscriber, #4433) [Link]

PANIC !!!

I was aware of some studies on "expert disasters" which found that the one thing nearly all of them had in common was that there were no "outsiders" on the driving committee.

And the converse - adding a small number of people who weren't experts to the steering committee made a noticeable improvement in the committee's decision making.

Cheers,
Wol

Re: Diversity in decision-making bodies

Posted Apr 15, 2020 12:31 UTC (Wed) by Nemo_bis (subscriber, #88187) [Link]

That's one of the central arguments of James Surowiecki's The Wisdom of Crowds.
https://en.wikipedia.org/wiki/The_Wisdom_of_Crowds#Five_e...

The dark side of expertise

Posted Jan 24, 2020 18:22 UTC (Fri) by jerojasro (guest, #98169) [Link]

This reminds me of a blog post ( https://danluu.com/wat/ ) that discusses a somewhat similar phenomenon: normalization of deviance, accepting broken things and rationalizing their continued existence.


Copyright © 2020, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds