>An event for which the timing is unpredictable may "at this time" have only a 5-percent probability of occurring during the coming month, but a 60-percent probability if the time frame is extended to one year (5 percent per month for 12 months).
If the time frame were extended to two years, would the probability be 120%?
The correct probability is:
1 - (1-.05)^12 ~= 0.46
Hard to credit a text about cognitive biases that makes elementary mistakes in probability.
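The arithmetic can be checked with a short sketch (plain Python, assuming an independent 5%-per-month event):

```python
# Probability of at least one occurrence of an independent
# 5%-per-month event over n months: 1 - (1 - p)^n
p = 0.05

for months in (12, 24):
    prob = 1 - (1 - p) ** months
    print(months, round(prob, 3))  # 12 -> 0.46, 24 -> 0.708
```

Note the two-year figure approaches but never exceeds 1, which is why the "120%" extrapolation fails.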
Doesn't this depend on whether the event can recur?
Your math correctly assesses the odds of a one-time event, as one minus the odds of the event never occurring. But if 5%/month is an expected frequency of a recurring event, then after 24 months we'd expect 1.2 occurrences. Yeah?
edit: Their language is awfully sloppy, though. It's not a 60% chance in the next 12 months; it's 0.6 expected occurrences.
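A minimal sketch of the distinction this comment draws, for a recurring event at 5%/month over 24 months (expected count can exceed 1; a probability cannot):

```python
# Expected number of occurrences vs. probability of at least one,
# for a recurring event with monthly probability p over n months.
p, n = 0.05, 24

expected = p * n                 # expected occurrences: 1.2
at_least_one = 1 - (1 - p) ** n  # ~0.71, still a valid probability

print(round(expected, 2), round(at_least_one, 3))
```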
You may have knowledge that the probability falls to zero after that. This is messy intelligence not math. Assuming this is a math book is a cognitive bias you are bringing into the reading.
The quoted text specifically presents this as the result of a 5% monthly chance recurring over 12 months, i.e. it was a pretty unambiguous appeal to mathematics.
Whether or not it is reasonable to compound such a messy probability is a whole other question, but the fact that the training material could not do so correctly (on its own terms, apparently) does not inspire great confidence in the practitioners trained on it.
> The quoted text specifically refers to this being the result of a 5% monthly chance recurring over 12-months, i.e. it was a pretty unambiguous appeal to mathematics.
No it isn't. If I have some information about events that will occur over the next few months, my estimates will differ based on that knowledge. E.g., I know there is going to be an election in some neighboring country that will involve violence, so I think the probability of being unable to ship overland through that country is low now but high in a few months.
This is from 1999. Meanwhile, IARPA now conducts studies where intelligence analysts look at various *INT layers of data stacked on top of each other, and cognitive modelers try to recreate the biases they demonstrate in the lab. See:
The Neural Basis of Decision-Making During Sensemaking: Implications for Human-System Interaction
How does this compare to the author's newer book "Structured analytic techniques for intelligence analysis" (2015), which, unlike the linked book, is available at my local library?
They are very different. SAT has a workbook-like structure, meant to be used as an on-the-job reference as well.
I think it's generally lacking in mathematical rigor and doesn't do that good a job of building models. It simply gives the user some entry-level tools to make their analysis more structured.
This is so cool. I'm an analyst at a tech company, and I wish we had more structure in our day-to-day analytical work. Too often we conduct satisficing analysis to inform decisions, and the number and impact of the decisions we inform determines our success as analysts. I would love the bandwidth to approach a problem with no prior assumptions, but doing so takes too much time.
Any other resources like this that I might find interesting and that might change the way I go about analytical work?
I once gifted a book I wrote about empirical measurement and stats to someone who turned out to be a CIA analyst instructor. She gave me a copy of this book in return. It was astonishing how well the ideas mapped.
Stumbled upon this subject as I was studying computer vision at a Japanese lab, under a professor who had studied Bayes' Theorem for his entire career (yes, he had it framed behind his desk).
I share his fascination with Bayes' theorem and believe it is one of the most powerful theorems out there. It keeps popping up in applications everywhere (ML, crypto, intelligence, pharma development, etc.) some 200 years after it was published. Though it is taught to thousands of undergrads every year in every country, I sometimes get the impression its simplicity fails to convey its true real-world capacity.
To think it was not so long ago assumed inferior to sampling and frequentist statistics.. :)
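A toy illustration of the base-rate reasoning Bayes' theorem enables (the numbers here are made up for illustration, not from the thread):

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
# Hypothetical example: a highly accurate test for a rare condition.
p_h = 0.01              # prior P(H): 1% base rate
p_e_given_h = 0.99      # sensitivity: P(positive | condition)
p_e_given_not_h = 0.01  # false-positive rate: P(positive | no condition)

# Total probability of a positive result
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior: probability of the condition given a positive result
posterior = p_e_given_h * p_h / p_e
print(round(posterior, 2))  # -> 0.5: a positive result is only a coin flip
```

Despite the "99% accurate" test, the low base rate drags the posterior down to 50%, which is exactly the kind of counterintuitive result the simple formula tends to hide.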
Read this about 10 years ago when I was with the state government, and it was very helpful in making real-time complex decisions. I can see where it would mesh very well with engineering.