Decision Tree

A Decision Tree is a graphical tool used to map complex decision-making processes, showing the different paths a decision can take and the outcomes they lead to. It is useful for handling uncertainty, analyzing risk, and structuring sequential decisions, but it can become unwieldy or misleading if not used properly.

Definition

A Decision Tree is a flowchart-like structure that maps out possible courses of action together with their probabilistic consequences. Viewed as a program, it is an algorithm made up entirely of conditional control statements: each internal node tests a condition or choice, and each path through the tree leads to a single final outcome.
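
As a minimal sketch of this "nested conditionals" view, the example below encodes one decision, one uncertain event, and three possible payoffs; the launch scenario, the payoffs, and the function name are illustrative assumptions, not part of any standard.

```python
# A tiny decision tree written as nested conditionals. The scenario
# (launch a product into a market that may be strong or weak) and the
# payoffs are hypothetical, chosen only for illustration.

def decide(launch_product: bool, market_is_strong: bool) -> int:
    """Return the payoff reached by one path through the tree."""
    if launch_product:            # decision node: launch or not
        if market_is_strong:      # chance node: which outcome occurred
            return 500_000        # leaf: strong market
        return -200_000           # leaf: weak market
    return 0                      # leaf: do nothing

print(decide(launch_product=True, market_is_strong=False))  # -200000
```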

Nodes

Decision Trees are composed of three kinds of nodes: decision nodes, drawn as squares, where the decision maker chooses among alternatives; chance nodes, drawn as circles, which account for uncertainty by depicting the possible outcomes of an event; and end nodes (leaf nodes), drawn as triangles, which mark the final outcome of a decision path.
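
One way to make the three node types concrete is to represent them as small data structures. The sketch below uses Python dataclasses; the class and field names are illustrative assumptions.

```python
# The three node types as plain Python dataclasses; names are illustrative.
from dataclasses import dataclass, field

@dataclass
class EndNode:                 # triangle: final payoff of a path
    payoff: float

@dataclass
class ChanceNode:              # circle: uncertain event
    # list of (probability, child node) pairs; probabilities should sum to 1
    outcomes: list = field(default_factory=list)

@dataclass
class DecisionNode:            # square: choice made by the decision maker
    # mapping of choice label -> child node
    choices: dict = field(default_factory=dict)
```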

Branches

Branches in a Decision Tree represent the choices available at each decision node, or the possible outcomes in the case of a chance event.

Root Node

The initial decision from which the tree grows is known as the root node. It represents the overall question or decision being explored.

Decision Analysis

This is the practice of making decisions in a structured way, with Decision Trees as one of its central tools. It can involve several branches of mathematics, including statistics, probability, and game theory.

Sequential Decision Making

Decision Trees often represent decisions that must be made sequentially. Each decision impacts subsequent decisions and, therefore, the final outcome.

Expected Value Calculation

A common way to identify the best course of action in a Decision Tree is to "roll back" the tree: compute the expected value at each chance node by probability-weighting its outcomes, then select the alternative with the highest expected value at each decision node.
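
A minimal rollback sketch, reusing the hypothetical launch scenario above and a plain-dict representation of nodes (the layout, payoffs, and probabilities are illustrative assumptions):

```python
# Rollback (backward induction) on a dict-based tree. The layout, payoffs,
# and probabilities are hypothetical.

def rollback(node):
    """Return the expected value of a node, taking the best branch at decisions."""
    if node["type"] == "end":
        return node["payoff"]
    if node["type"] == "chance":
        # probability-weighted average of the children
        return sum(p * rollback(child) for p, child in node["outcomes"])
    if node["type"] == "decision":
        # choose the alternative with the highest expected value
        return max(rollback(child) for child in node["choices"].values())
    raise ValueError(f"unknown node type: {node['type']}")

launch = {"type": "chance", "outcomes": [
    (0.6, {"type": "end", "payoff": 500_000}),
    (0.4, {"type": "end", "payoff": -200_000}),
]}
tree = {"type": "decision", "choices": {
    "launch": launch,
    "do nothing": {"type": "end", "payoff": 0},
}}

print(rollback(tree))  # 0.6 * 500000 + 0.4 * (-200000) = 220000.0
```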

Risk Analysis

Decision Trees can be used to understand and quantify risk. This is done by assigning probabilities to chance nodes and using these to calculate expected values and variances.
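
As a sketch of quantifying risk at a single chance node, the snippet below computes the expected value, variance, and standard deviation from assumed probabilities and payoffs (the numbers are hypothetical):

```python
# Expected value and variance at one chance node, from assumed
# (probability, payoff) pairs. The numbers are hypothetical.

outcomes = [(0.6, 500_000), (0.4, -200_000)]

expected = sum(p * x for p, x in outcomes)
variance = sum(p * (x - expected) ** 2 for p, x in outcomes)

print(expected)         # 220000.0
print(variance ** 0.5)  # standard deviation, roughly 342929
```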

Sensitivity Analysis

This process involves adjusting the probabilities or payoffs in a Decision Tree to see how sensitive the final outcome is to changes in these inputs.
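
A small sensitivity-analysis sketch, sweeping the probability of the favorable outcome in the hypothetical launch example to see where the recommended decision flips:

```python
# Sweep the probability of the favorable outcome and report when the
# recommended decision flips. Payoffs reuse the hypothetical launch example.

strong_payoff, weak_payoff, do_nothing = 500_000, -200_000, 0

for p in [i / 10 for i in range(11)]:
    ev_launch = p * strong_payoff + (1 - p) * weak_payoff
    best = "launch" if ev_launch > do_nothing else "do nothing"
    print(f"P(strong)={p:.1f}  EV(launch)={ev_launch:>9.0f}  best: {best}")
```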

Utility Theory

In some complex decisions, where outcomes have different levels of satisfaction or utility, Decision Trees may incorporate utility functions to better capture the decision maker’s preferences.
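
One common choice is an exponential utility function, which models risk aversion through a single risk-tolerance parameter. The sketch below compares expected monetary value with expected utility for the hypothetical launch gamble; the parameter value R and the payoffs are assumptions chosen so the two rankings disagree.

```python
# Exponential utility with risk tolerance R models a risk-averse decision
# maker. R and the payoffs are hypothetical, picked so that the ranking by
# expected utility disagrees with the ranking by expected monetary value.
import math

R = 150_000  # risk tolerance: smaller R means more risk-averse

def utility(x):
    return 1 - math.exp(-x / R)

outcomes = [(0.6, 500_000), (0.4, -200_000)]

expected_value = sum(p * x for p, x in outcomes)              # 220000.0 > 0
expected_utility = sum(p * utility(x) for p, x in outcomes)   # about -0.54 < utility(0) == 0

# By expected monetary value the gamble beats doing nothing, but for this
# risk-averse utility the safe choice wins.
print(expected_value, expected_utility, utility(0))
```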

Pruning

This refers to the removal of decision branches in a Decision Tree that don’t affect the final decision, simplifying the tree.
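
A minimal pruning sketch at a single decision node: once each alternative has been rolled back to an expected value, dominated branches can be dropped without changing the recommendation (the alternatives and values are hypothetical).

```python
# Pruning dominated alternatives at a single decision node. The expected
# values are assumed to have already been rolled back; names and numbers
# are hypothetical.

rolled_back = {"launch": 220_000.0, "license the design": 150_000.0, "do nothing": 0.0}

best = max(rolled_back.values())
pruned = {choice: ev for choice, ev in rolled_back.items() if ev == best}

print(pruned)  # {'launch': 220000.0}
```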