
In my blog “Making Informed Decisions in Imperfect Situations”, I discussed the importance of properly and objectively framing the decision that we seek to make and how that impacts the data that we gather (and ignore) in an effort to make an informed decision. That is:

*Are you trying to gather data to determine the right decision, or are you gathering data to support the decision that you have already made?*

In that blog, I introduced two tools that can help us make informed decisions using the best available data, even when that data might be incomplete, conflicting, and/or distorted by others. The first tool is the Informed Decision-making Framework (special thanks to Erik Burd), with its iterative process outlined in Figure 1.

**Figure 1: Tool #1: Informed Decision-making Framework**

The second tool is the Decision Matrix (special thanks to Craig “Doc” Savage for sharing it). We can use the Decision Matrix to help us make an informed decision by following the steps outlined in Figure 2.

**Figure 2: Tool #2: The MECE Decision Matrix**

**Objectivity** is everything. If you come into this process with your mind already made up about the decision that you want to prove, then you will only find data that supports your position and find reasons to ignore the data that runs counter to what you already believe. Many folks succumb to this **confirmation bias** – the tendency to interpret new evidence as confirmation of one's existing beliefs – and only seek data that supports the decision that they have already made.

We can use Bayes’ Theorem and Bayesian Inference to help combat confirmation bias in our decision-making.

Understanding how to use Bayes’ Theorem (and Bayesian Inference) to help combat confirmation bias starts with a basic understanding of **Conditional Probabilities**.

*Conditional Probability is a measure of the probability of an event occurring, given that another event has already occurred* (Figure 3).

**Figure 3: Conditional Probabilities**
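The definition above can be sketched in a few lines of Python. The two-card example below (a standard 52-card deck, drawing without replacement) is my own illustration, not from the original post.

```python
# Conditional probability: P(A | B) = P(A and B) / P(B)
# Example: drawing two cards without replacement from a 52-card deck.
p_first_king = 4 / 52                # P(B): the first card is a king
p_both_kings = (4 / 52) * (3 / 51)   # P(A and B): both cards are kings

# P(second king | first king): the remaining deck holds 3 kings in 51 cards
p_second_given_first = p_both_kings / p_first_king
print(round(p_second_given_first, 4))  # 0.0588, i.e. 3/51
```

Note how the "given" event shrinks the sample space: once the first king is drawn, we are computing a probability over the 51 remaining cards.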

Conditional probability introduces the concept of using **prior knowledge** to make an informed decision in an environment where data accuracy and completeness are changing. That sets up **Bayes’ Theorem**, which determines the probability of an event occurring based on prior knowledge of conditions related to that event (Figure 4).

**Figure 4: What is Bayes’ Theorem**

For example, if a disease is related to age, then using Bayes' theorem, a person's age can be used to more accurately assess the probability that they have the disease, compared to the assessment of the probability of disease made without knowledge of the person's age.
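A quick sketch of that disease-and-age example, using made-up numbers (the prevalence, age threshold, and percentages below are hypothetical, purely for illustration):

```python
# Bayes' Theorem: P(Disease | Age) = P(Age | Disease) * P(Disease) / P(Age)
# All inputs below are hypothetical illustration values.
p_disease = 0.01              # prior: 1% of the population has the disease
p_age_given_disease = 0.50    # 50% of those with the disease are over 65
p_age = 0.15                  # 15% of the overall population is over 65

p_disease_given_age = p_age_given_disease * p_disease / p_age
print(round(p_disease_given_age, 4))  # 0.0333
```

Knowing the person's age moved the estimate from the 1% base rate to roughly 3.3% — the prior is updated, not replaced, by the new evidence.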

In another example, we can determine the likelihood that it is going to rain given that the day started cloudy. Figure 5 shows the Bayes’ Theorem framing and the associated calculations (plus a cool website that helps with the Bayes’ Theorem calculation).

**Figure 5: Bayes’ Theorem in Action!**
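The rain-given-cloudy calculation can be reproduced in a few lines. The three input probabilities below are assumed illustration values, not the exact figures from Figure 5:

```python
# Bayes' Theorem: P(Rain | Cloudy) = P(Cloudy | Rain) * P(Rain) / P(Cloudy)
# Hypothetical inputs for illustration:
p_rain = 0.10               # prior: it rains on 10% of days
p_cloudy_given_rain = 0.50  # 50% of rainy days start out cloudy
p_cloudy = 0.40             # 40% of all days start out cloudy

p_rain_given_cloudy = p_cloudy_given_rain * p_rain / p_cloudy
print(round(p_rain_given_cloudy, 3))  # 0.125
```

With these inputs, seeing a cloudy morning raises the chance of rain from the 10% prior to 12.5%.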

Finally, **Bayesian Inference** is a method of statistical inference in which Bayes' theorem is used to update the probability of a hypothesis as more data becomes available. **Bayesian Inference** can be used to gradually update the probability of the occurrence of an event as more data is gathered or updated, such as what is happening today with the growing body of data about COVID.
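The "update as data arrives" idea can be sketched with a toy two-hypothesis example (my own illustration: deciding whether a coin is fair or biased toward heads as each flip is observed):

```python
# Bayesian Inference: apply Bayes' theorem once per new data point,
# using each posterior as the prior for the next update.
# Hypotheses: the coin is fair (P(H)=0.5) or biased (P(H)=0.8).
p_fair, p_biased = 0.5, 0.5   # equal prior belief in each hypothesis

flips = "HHTHHHHH"            # observed data, one flip at a time
for flip in flips:
    like_fair = 0.5                           # P(flip | fair)
    like_biased = 0.8 if flip == "H" else 0.2  # P(flip | biased)
    # Bayes' theorem, renormalized over the two hypotheses
    num_fair = like_fair * p_fair
    num_biased = like_biased * p_biased
    total = num_fair + num_biased
    p_fair, p_biased = num_fair / total, num_biased / total

print(round(p_biased, 3))  # ≈ 0.915
```

After eight flips the belief in "biased" has climbed from 50% to about 91% — and, crucially, a run of tails would have pushed it back down. The model stays open to revision, which is exactly the antidote to confirmation bias.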

We live in a world of probabilities and can use probabilities to improve the likelihood of making better decisions, especially decisions about preventative actions. But using probabilities to make decisions does not guarantee that you will always make the right decision.

**Resulting**, a term that comes from the world of poker, is the tendency to equate the quality of a decision with the quality of its outcome. In other words: if we got the result we wanted, we assume it must be because we did something right; if we didn't get the result we wanted, we assume it must be because we did something wrong. But a bad outcome does not mean the decision was wrong based on what was known at the time the decision was made.

And that’s where Bayesian Inference can help. As we get more data, we can revise our probabilities of making the right decision. Again, that doesn’t guarantee that we’ll get the desired outcome, but we can improve our chances of the desired outcome (surviving a car crash, surviving a bike accident, surviving COVID) by employing Bayesian concepts to factor updated data into our decision-making models.

The ability to update the likelihood of an event occurring is critical in situations where new data is being generated and new insights uncovered. The willingness to ingest new facts, toss out outdated facts, and ignore the droning in our ears of other people’s distorted versions of the truth is the key to survival, not only as professionals, but as a species.



Posted 9 November 2021

© 2021 TechTarget, Inc.
