Uncertainty and Risk

Christopher Makler

Stanford University Department of Economics

 

Econ 51: Lecture 4

Up to now: no uncertainty about what is going to happen in the world.

In the real world: lots of uncertainty!

We'll model this by thinking about preferences over consumption lotteries in which you consume different amounts in different states of the world.

We're just going to think about preferences, not budget constraints.

Today's Agenda

Part 1: Preferences over Risk
  Lotteries
  Expected Utility
  Certainty Equivalence and Risk Premium

Part 2: Market Implications
  [worked example]
  Insurance
  Risky Assets

 

pollev.com/chrismakler

Option A: I flip a coin.
  Heads, you get 1 extra homework point.
  Tails, you get 9 extra homework points.

Option B: I give you 5 extra homework points.
  Heads, you get 5 extra homework points.
  Tails, you get 5 extra homework points.


Expected Utility

Suppose your utility from points is given by

u(c) = \sqrt{c}

so with c_1 = 1 and c_2 = 9, we have u(c_1) = 1 and u(c_2) = 3.

Expected value of the homework points (c) is:

\mathbb{E}[c] = {1 \over 2} \times 1 + {1 \over 2} \times 9 = 5

Expected value of your utility (u) is:

\mathbb{E}[u] = {1 \over 2} \times 1 + {1 \over 2} \times 3 = 2

[Figure: u(c) = \sqrt{c}, with c_1, \mathbb{E}[c], and c_2 on the horizontal axis and u(c_1), u(c_2) on the vertical axis]
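A quick numerical check of these two calculations (a Python sketch; the lottery and the square-root utility function are the ones on this slide, variable names are illustrative):

```python
import math

# Lottery from this example: 1 point or 9 points, each with probability 1/2
probs = [0.5, 0.5]
points = [1, 9]

def u(c):
    # Within-state utility from the slide: u(c) = sqrt(c)
    return math.sqrt(c)

expected_points = sum(p * c for p, c in zip(probs, points))       # E[c] = 0.5*1 + 0.5*9 = 5
expected_utility = sum(p * u(c) for p, c in zip(probs, points))   # E[u] = 0.5*1 + 0.5*3 = 2

print(expected_points, expected_utility)  # 5.0 2.0
```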

Risk Aversion

u(c) = \sqrt{c}

Expected value of the homework points (c) is:

\mathbb{E}[c] = {1 \over 2} \times 1 + {1 \over 2} \times 9 = 5

Utility from expected points is:

u(\mathbb{E}[c]) = \sqrt{5} \approx 2.24

Expected value of your utility (u) is:

\mathbb{E}[u] = 2

Because the utility from having 5 points for sure is higher than the expected utility of the lottery, this person is risk averse.

[Figure 1: u(c) = \sqrt{c}, showing u(\mathbb{E}[c]) above \mathbb{E}[u] at \mathbb{E}[c]. Figure 2: in (c_1, c_2) space, the indifference curve for 2 utils passes through the lottery, while the indifference curve for \(\sqrt{5}\) utils passes through (\mathbb{E}[c], \mathbb{E}[c]) on the certainty line.]
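The same comparison in code (a sketch continuing the example above): utility from the expected points versus expected utility.

```python
import math

probs, points = [0.5, 0.5], [1, 9]
u = math.sqrt

expected_points = sum(p * c for p, c in zip(probs, points))      # E[c] = 5
expected_utility = sum(p * u(c) for p, c in zip(probs, points))  # E[u] = 2
utility_of_expected = u(expected_points)                         # u(E[c]) = sqrt(5) ≈ 2.24

# u(E[c]) > E[u], so this person is risk averse
print(utility_of_expected > expected_utility)  # True
```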

pollev.com/chrismakler

Option A: I flip a coin.
  Heads, you get 1 extra homework point.
  Tails, you get 9 extra homework points.

Option B: I give you 4 extra homework points.
  Heads, you get 4 extra homework points.
  Tails, you get 4 extra homework points.

Certainty Equivalent

Suppose your utility from points is given by

u(c) = \sqrt{c}

with c_1 = 1 and c_2 = 9.

Expected value of your utility (u) is:

\mathbb{E}[u] = {1 \over 2} \times 1 + {1 \over 2} \times 3 = 2

What amount \(CE\), if you had it for sure, would give you the same utility?

u(CE) = 2 \Rightarrow CE = 4

[Figure: u(c) = \sqrt{c} with CE and \mathbb{E}[c] on the horizontal axis; in (c_1, c_2) space, the indifference curve for 2 utils crosses the certainty line at (CE, CE).]

Certainty Equivalent

[Figure: u(c) = \sqrt{c} with CE and \mathbb{E}[c] on the horizontal axis and \mathbb{E}[u], u(c_1), u(c_2) on the vertical axis]

Certainty Equivalent = 4

Expected Points = 5

Risk Premium = 1
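A sketch of how these three numbers fit together, using u(c) = \sqrt{c} so that the inverse of u is squaring (names are illustrative):

```python
import math

probs, points = [0.5, 0.5], [1, 9]
u = math.sqrt

expected_points = sum(p * c for p, c in zip(probs, points))      # E[c] = 5
expected_utility = sum(p * u(c) for p, c in zip(probs, points))  # E[u] = 2

# Certainty equivalent: solve u(CE) = E[u]; with u = sqrt, CE = (E[u])^2
certainty_equivalent = expected_utility ** 2                     # 4
risk_premium = expected_points - certainty_equivalent            # 5 - 4 = 1

print(certainty_equivalent, risk_premium)  # 4.0 1.0
```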

Risk Premium

How much would you be willing to pay to avoid risk?

What is Sixt offering me here?

Lotteries

A lottery is a set of outcomes, each of which occurs with a known probability.

Example 1: Betting on a Coin Toss
  Start with $250 for sure.
  If you bet $150 on a coin toss, you would face the lottery: 50% chance of 100, 50% chance of 400.
  Would you do it?

Example 2: Deal or No Deal
  Two briefcases left: $200K and $1 million.
  The "banker" offers you $561,000 to walk away; or you could pick one of the cases.
  Would you accept that offer? What's the highest offer you would accept?

We can represent a lottery as a "bundle" in "state 1 - state 2 space."

Independence Assumption

Suppose the way you feel about money doesn't depend on the state of the world.

Payoff if you don't take the bet: \(u(250)\)
Payoff if you win the bet: \(u(400)\)
Payoff if you lose the bet: \(u(100)\)

Expected Utility

\text{Suppose you consume }c_1\text{ with probability }\pi_1\text{ and }c_2\text{ with probability }\pi_2.
\text{Expected consumption: }\mathbb{E}[c] = \pi_1 c_1 + \pi_2 c_2
\text{Suppose your utility from consuming }c\text{ is }u(c).
\text{Expected utility: }\mathbb{E}[u(c_1,c_2)] = \pi_1 u(c_1) + \pi_2 u(c_2)

This is a "Von Neumann-Morgenstern utility function": a probability-weighted average of a consistent within-state utility function \(u(c_s)\).
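A minimal sketch of evaluating a lottery under this definition (the function name is illustrative; using u(c) = \sqrt{c} for the coin-toss bet is an assumption, not part of the example itself):

```python
import math

def expected_utility(probs, outcomes, u):
    """Probability-weighted average of the within-state utility u(c_s)."""
    return sum(pi * u(c) for pi, c in zip(probs, outcomes))

# The coin-toss bet from Example 1: 50% chance of 100, 50% chance of 400
print(expected_utility([0.5, 0.5], [100, 400], math.sqrt))  # 0.5*10 + 0.5*20 = 15.0
print(math.sqrt(250))  # u(250) ≈ 15.8, so a sqrt-utility person would decline this bet
```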

Comparison to intertemporal consumption

Last time: \(c_1\), \(c_2\) represented consumption in different time periods:
u(c_1,c_2) = v(c_1)+\beta v(c_2)

This time: \(c_1\), \(c_2\) represent consumption in different states of the world:
\mathbb E[u(c_1,c_2)] = \pi_1 u(c_1)+ \pi_2 u(c_2)

Marginal Rate of Substitution

\mathbb{E}[u(c_1,c_2)] = \pi u(c_1) + (1-\pi) u(c_2)
MU_1 = \text{MU from another dollar in state 1} = \pi u'(c_1)
MU_2 = \text{MU from another dollar in state 2} = (1-\pi) u'(c_2)
MRS = \frac{MU_1}{MU_2} = \frac{\pi}{1-\pi} \times \frac{u'(c_1)}{u'(c_2)}
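A numerical illustration of this MRS formula (a sketch; carrying over u(c) = \sqrt{c} and the coin-toss lottery (100, 400) from the earlier examples as assumptions):

```python
import math

def u_prime(c):
    # Marginal utility for u(c) = sqrt(c): u'(c) = 1 / (2*sqrt(c))
    return 1 / (2 * math.sqrt(c))

def mrs(pi, c1, c2):
    # MRS = [pi / (1 - pi)] * [u'(c1) / u'(c2)]
    return (pi / (1 - pi)) * (u_prime(c1) / u_prime(c2))

# At the coin-toss lottery (c1, c2) = (100, 400) with pi = 1/2:
print(mrs(0.5, 100, 400))  # 2.0: willing to give up 2 state-2 dollars for 1 state-1 dollar
```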
Risk Aversion

\text{Expected consumption: }\mathbb{E}[c] = \pi_1 c_1 + \pi_2 c_2
\text{Expected utility (i.e., utility from taking the gamble): }\mathbb{E}[u(c)] = \pi_1 u(c_1) + \pi_2 u(c_2)
\text{Utility from having }\mathbb{E}[c]\text{ for sure: }u(\mathbb{E}[c]) = u(\pi_1 c_1 + \pi_2 c_2)

\text{Risk averse: }u(\mathbb{E}[c]) > \mathbb{E}[u(c)]
You prefer having E[c] for sure to taking the gamble.

\text{Risk neutral: }u(\mathbb{E}[c]) = \mathbb{E}[u(c)]
You're indifferent between the two.

\text{Risk loving: }u(\mathbb{E}[c]) < \mathbb{E}[u(c)]
You prefer taking the gamble to having E[c] for sure.
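These three cases can be checked mechanically (a sketch; the helper name risk_attitude and the sample utility functions are illustrative):

```python
import math

def risk_attitude(probs, outcomes, u):
    """Compare u(E[c]) with E[u(c)] to classify preferences toward this gamble."""
    e_c = sum(pi * c for pi, c in zip(probs, outcomes))
    e_u = sum(pi * u(c) for pi, c in zip(probs, outcomes))
    if u(e_c) > e_u:
        return "risk averse"
    if u(e_c) < e_u:
        return "risk loving"
    return "risk neutral"

print(risk_attitude([0.5, 0.5], [1, 9], math.sqrt))       # concave u: risk averse
print(risk_attitude([0.5, 0.5], [1, 9], lambda c: c))     # linear u: risk neutral
print(risk_attitude([0.5, 0.5], [1, 9], lambda c: c**2))  # convex u: risk loving
```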

Certainty Equivalence

"How much money would you need to have for sure

to be just as well off as you are with your current gamble?"

Risk Premium

"How much would you be willing to pay to avoid a fair bet?"

Market Implications

How do markets allow us to shift our consumption across states of the world?

Case 1: Insurance

Case 2: Risky Assets

Suppose you have $35,000. Life's good.

If you get into a car accident, you'd lose $10,000, leaving you with $25,000.

You might want to insure against this loss by buying a contingent contract that pays you $K in the case of a car accident.

[Figure: money in the good state (35,000) vs. money in the bad state (25,000), showing the endowment point]

You want to insure against this loss by buying a contingent contract that pays you $K in the case of a car accident. Suppose this costs $P.

Now in the good state, you have $35,000 - P.

In the bad state, you have $25,000 + K - P.

[Figure: budget line through the endowment (35,000, 25,000) and the insured bundle (35,000 - P, 25,000 + K - P) in good-state/bad-state space]

\text{price ratio} = \frac{P}{K - P}

\text{Suppose each dollar of payout costs }\gamma\text{, so buying }K\text{ units costs }P = \gamma K\text{. Then}

\text{price ratio} = \frac{P}{K - P} = \frac{\gamma K}{K - \gamma K} = \frac{\gamma}{1 - \gamma}

\text{Recall: }\mathbb{E}[u(c_1,c_2)] = \pi u(c_1) + (1-\pi) u(c_2)

MRS = \frac{\partial \mathbb{E}[u(c_1,c_2)]/\partial c_1}{\partial \mathbb{E}[u(c_1,c_2)]/\partial c_2} = \frac{\pi}{1-\pi} \times \frac{u'(c_1)}{u'(c_2)}

\text{If risk neutral: }u(c) = c\text{, so }MRS = \frac{\pi}{1-\pi}

\text{Since the price ratio is }\frac{\gamma}{1 - \gamma}\text{, a risk-neutral insurance company sets }\gamma = \pi.

\text{Tangency condition: }MRS = \text{price ratio}\text{. With }\gamma = \pi\text{:}

\frac{\pi}{1-\pi} \times \frac{u'(c_1)}{u'(c_2)} = \frac{\pi}{1-\pi}

\frac{u'(c_1)}{u'(c_2)} = 1

u'(c_1) = u'(c_2)

c_1 = c_2

\Rightarrow \text{a risk-averse person facing an actuarially fair price will fully insure.}
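A numerical check of this result (a sketch using the car-accident numbers above; the accident probability pi = 0.1 and the utility function u(c) = \sqrt{c} are illustrative assumptions, not from the slides):

```python
import math

wealth_good, wealth_bad = 35_000, 25_000   # endowment: $35,000, minus a $10,000 loss in an accident
pi = 0.1                                   # assumed probability of the accident (state 1)
gamma = pi                                 # actuarially fair price per dollar of coverage

def expected_utility(K):
    # Buy K dollars of coverage at total premium gamma*K
    c1 = wealth_bad - gamma * K + K        # consumption if the accident happens
    c2 = wealth_good - gamma * K           # consumption if it doesn't
    return pi * math.sqrt(c1) + (1 - pi) * math.sqrt(c2)

# Search over coverage levels; the maximum is at K = 10,000 (full insurance, c1 = c2)
best_K = max(range(0, 20_001, 100), key=expected_utility)
print(best_K)  # 10000
```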

Next Week: Exchange

Bring two agents, each with an endowment of two goods, into the same model and have them trade.