Christopher Makler
Stanford University Department of Economics
Econ 51: Lecture 16
Possible Friday lecture:
Give you everything you need to annoy everyone at Thanksgiving dinner
Which strategies are dominated?
(Player 1 chooses the row: U, M, or D. Player 2 chooses the column: L or R.)

                L            R
    U        4 , 4        0 , 0
    M        0 , 0        4 , 4
    D      1.5 , 1.5    1.5 , 1.5
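A sketch of the answer, assuming the payoffs are arranged as in the table above: D is not dominated by U alone or by M alone, but it is dominated by the mixed strategy that plays U and M with probability ½ each, since
\[
\tfrac{1}{2}(4) + \tfrac{1}{2}(0) = 2 > 1.5 \quad \text{against } L, \qquad \tfrac{1}{2}(0) + \tfrac{1}{2}(4) = 2 > 1.5 \quad \text{against } R.
\]
So D is strictly dominated even though no pure strategy dominates it; this is the same expected-value logic we use for lotteries in this lecture.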
pollev.com/chrismakler
What color shirt am I wearing?
pollev.com/chrismakler
Which would you choose?
$600,000 for sure
An equal chance of $200,000 or $1,000,000
pollev.com/chrismakler
Which would you choose?
DEAL: $561,000 for sure
NO DEAL: An equal chance of $200,000 or $1,000,000
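For reference, the gamble in both polls is the same, and its expected value is
\[
\mathbb{E}[c] = \tfrac{1}{2}(200{,}000) + \tfrac{1}{2}(1{,}000{,}000) = 600{,}000,
\]
so the first poll offers exactly the expected value for sure, while the DEAL option offers $39,000 less than the expected value.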
Up to now: no uncertainty about what is going to happen in the world.
In the real world: lots of uncertainty!
We'll model this by thinking about preferences over consumption lotteries in which you consume different amounts in different states of the world.
We're just going to think about preferences, not budget constraints.
[worked example]
Lotteries
Expected Utility
Certainty Equivalence and Risk Premium
Option A: I flip a coin. Heads, you get 1 extra homework point; tails, you get 9 extra homework points.

Option B: I give you 5 extra homework points. Heads, you get 5 extra homework points; tails, you get 5 extra homework points.
Suppose your utility from points is given by \(u(c) = \sqrt{c}\).

For Option A (the coin flip):
Expected value of the homework points (c) is \(\mathbb{E}[c] = \tfrac{1}{2}(1) + \tfrac{1}{2}(9) = 5\).
Expected value of your utility (u) is \(\mathbb{E}[u] = \tfrac{1}{2}\sqrt{1} + \tfrac{1}{2}\sqrt{9} = 2\).

For Option B (5 points for sure):
Expected value of the homework points (c) is \(\mathbb{E}[c] = 5\).
Utility from expected points is \(u(5) = \sqrt{5} \approx 2.24\).
Expected value of your utility (u) is \(\mathbb{E}[u] = \sqrt{5} \approx 2.24\).

Because the utility from having 5 points for sure (\(\approx 2.24\)) is higher than the expected utility of the lottery (2), this person is risk averse.
Option A: I flip a coin. Heads, you get 1 extra homework point; tails, you get 9 extra homework points.

Option B: I give you 4 extra homework points. Heads, you get 4 extra homework points; tails, you get 4 extra homework points.
Is your answer now different than it was before...?
Suppose your utility from points is given by \(u(c) = \sqrt{c}\).
Expected value of your utility (u) from the coin flip is \(\mathbb{E}[u] = \tfrac{1}{2}\sqrt{1} + \tfrac{1}{2}\sqrt{9} = 2\).
What amount \(CE\), if you had it for sure, would give you the same utility?
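A worked answer, assuming the square-root utility \(u(c) = \sqrt{c}\) used above: the certainty equivalent solves \(u(CE) = \mathbb{E}[u]\), so
\[
\sqrt{CE} = 2 \quad \Longrightarrow \quad CE = 4.
\]
Four points for sure gives exactly the same utility as the coin-flip lottery, which is why the 4-point sure thing in Option B leaves this person indifferent.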
Risk Premium
How much would you be willing to pay to avoid risk?
What is Sixt offering me here?
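As a concrete illustration (again under the square-root utility assumed in the homework-points example), the risk premium is the gap between the expected value of a lottery and its certainty equivalent:
\[
RP = \mathbb{E}[c] - CE = 5 - 4 = 1,
\]
so this person would give up 1 expected point to replace the coin flip with its expected value for sure.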
Example 1: Betting on a Coin Toss
Start with $250 for sure. If you bet $150 on a coin toss, you would face the lottery: 50% chance of $100, 50% chance of $400. Would you do it?

Example 2: Deal or No Deal
Two briefcases are left: $200K and $1 million. The "banker" offers you $561,000 to walk away; or you could pick one of the cases. Would you accept that offer? What's the highest offer you would accept?
A lottery is a set of outcomes,
each of which occurs with a known probability.
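In symbols (one compact way of writing the same definition), a lottery with \(S\) possible states is
\[
\left(c_1, c_2, \ldots, c_S \;;\; \pi_1, \pi_2, \ldots, \pi_S\right), \qquad \pi_s \ge 0, \quad \sum_{s=1}^{S} \pi_s = 1,
\]
where \(c_s\) is the amount you consume in state \(s\) and \(\pi_s\) is the probability of that state. The coin-toss bet above, for example, is the lottery \((100, 400; \tfrac{1}{2}, \tfrac{1}{2})\).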
Suppose the way you feel about money doesn't depend on the state of the world.
Independence Assumption
Payoff if don't take the bet: \(v(250)\)
Payoff if win the bet: \(v(400)\)
Payoff if lose the bet: \(v(100)\)
"Von Neumann-Morgenstern Utility Function"
Probability-weighted average of a consistent within-state utility function \(v(c_s)\)
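Written out, with \(\pi_s\) denoting the probability of state \(s\):
\[
U(\text{lottery}) = \sum_{s} \pi_s \, v(c_s),
\]
so the payoff from taking the coin-toss bet is \(\tfrac{1}{2} v(100) + \tfrac{1}{2} v(400)\), to be compared with the payoff \(v(250)\) from not taking it.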
You prefer having E[c] for sure to taking the gamble if \(v(\mathbb{E}[c]) > \mathbb{E}[v(c)]\) (risk averse).
You're indifferent between the two if \(v(\mathbb{E}[c]) = \mathbb{E}[v(c)]\) (risk neutral).
You prefer taking the gamble to having E[c] for sure if \(v(\mathbb{E}[c]) < \mathbb{E}[v(c)]\) (risk loving).
"How much money would you need to have for sure
to be just as well off as you are with your current gamble?"
"How much would you be willing to pay to avoid a fair bet?"