This talk will survey the main themes of Bayesian decision theory, emphasizing the use of Bayesian statistics to define "optimal" decision making. Most of the talk will examine decision making under uncertainty when an agent receives a payoff after each individual decision. We'll start with the basics of Signal Detection Theory, a theory of optimal decision making that is historically important because it independently measures an agent's sensitivity (the ability to discriminate signal from noise, which reflects the difficulty of a task) and bias (the agent's tendency to favor one response over another, regardless of the evidence). Next, we'll cover decision making based on prior probability distributions, decision making based on posterior probability distributions, and decision making based on minimizing risk. An example of risk minimization (due to Freeman, 1996) applied to the "shape from shading" problem in visual perception will be discussed. Lastly, we'll briefly mention dynamic decision making, which arises when an agent must make a sequence of decisions before receiving a payoff and there are temporal dependencies among the decisions.
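To make two of the ideas above concrete, here is a minimal sketch (not part of the talk itself): computing sensitivity (d') and bias (criterion c) under the standard equal-variance Gaussian SDT model, and choosing a Bayes-optimal action by minimizing expected loss over a posterior. The hit/false-alarm rates and the loss matrix are illustrative assumptions, not data from the talk.

```python
from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    """Equal-variance Gaussian SDT: d' measures discriminability,
    c measures response bias (c = 0 means no bias)."""
    z = NormalDist().inv_cdf          # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

def bayes_action(posterior, loss):
    """Pick the action minimizing expected loss (Bayes risk).
    loss[a][s] is the loss of taking action a when the true state is s."""
    risks = [sum(p * l for p, l in zip(posterior, row)) for row in loss]
    return min(range(len(risks)), key=risks.__getitem__)

# Illustrative numbers: symmetric hit/false-alarm rates give c = 0.
d, c = dprime_and_criterion(0.84, 0.16)

# With 0-1 loss, minimizing risk reduces to picking the MAP state.
action = bayes_action([0.3, 0.7], [[0, 1], [1, 0]])
```

Note that under 0-1 loss the risk-minimizing action coincides with the maximum a posteriori (MAP) choice, which is why posterior-based and risk-based decision making are presented as successive refinements.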