Incomplete Information and Risk Aversion

Christopher Makler

Stanford University Department of Economics

 

Econ 51: Lecture 7

Games of Incomplete Information

[Two 2×2 payoff matrices, one for each possible game: Player 1 chooses A or B, Player 2 chooses X or Y, and the payoffs differ across the two games.]

Suppose one of these two games is being played.

Both players know there is an equal probability of each game.

Only player 1 knows which game is being played right now.

What is player 1's strategy space?
Player 2's?



We can model this "as if" there is a nonstrategic player called Nature who moves first, flipping a coin, and picks which game is being played based on the coin flip.
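A minimal Python sketch of the answer to the strategy-space question above, assuming only what is stated here (two actions per player and a binary coin flip); the labels match the \(A^HA^T\)-style notation used later in the lecture:

from itertools import product

# Player 1 observes Nature's coin flip, so a strategy for player 1 specifies an action
# for each possible flip; player 2 observes nothing, so a strategy is a single action.
p1_actions = ["A", "B"]
p2_actions = ["X", "Y"]
flips = ["H", "T"]

p1_strategies = list(product(p1_actions, repeat=len(flips)))  # one action per flip
p2_strategies = p2_actions

print(p1_strategies)  # [('A', 'A'), ('A', 'B'), ('B', 'A'), ('B', 'B')], i.e. A^H A^T, A^H B^T, B^H A^T, B^H B^T
print(p2_strategies)  # ['X', 'Y']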

[Game tree: Nature flips a coin (Heads 1/2, Tails 1/2); Player 1, who observes the flip, chooses A or B (\(A^H\) or \(B^H\) after Heads, \(A^T\) or \(B^T\) after Tails); Player 2 chooses X or Y without observing the flip.]


The Bayesian Normal Form representation of the game shows the expected payoffs for each of the strategies the players could play:

[Bayesian normal form table: Player 1's strategies are \(A^HA^T\), \(A^HB^T\), \(B^HA^T\), and \(B^HB^T\); Player 2's strategies are X and Y. Each cell contains the players' expected payoffs.]
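A sketch of how each cell of that table is computed: average the payoffs of the two component games using Nature's probabilities (1/2 each). The numeric payoff matrices below are illustrative placeholders, not necessarily the numbers on the slide.

prior = {"H": 0.5, "T": 0.5}

# game[flip][(p1_action, p2_action)] = (u1, u2); placeholder payoffs for illustration only
game = {
    "H": {("A", "X"): (2, 0), ("A", "Y"): (0, 4), ("B", "X"): (2, 0), ("B", "Y"): (0, 4)},
    "T": {("A", "X"): (0, 4), ("A", "Y"): (2, 0), ("B", "X"): (0, 4), ("B", "Y"): (2, 0)},
}

def expected_payoffs(p1_strategy, p2_action):
    """p1_strategy maps each coin flip to an action, e.g. {'H': 'A', 'T': 'B'} for A^H B^T."""
    u1 = sum(prior[t] * game[t][(p1_strategy[t], p2_action)][0] for t in prior)
    u2 = sum(prior[t] * game[t][(p1_strategy[t], p2_action)][1] for t in prior)
    return (u1, u2)

for aH in ("A", "B"):
    for aT in ("A", "B"):
        for y in ("X", "Y"):
            print(f"{aH}^H {aT}^T vs {y}:", expected_payoffs({"H": aH, "T": aT}, y))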

But...can we just use the average payoff?

Risk Aversion

Up to now: no uncertainty about what is going to happen in the world.

In the real world: lots of uncertainty!

We'll model preferences over risk by thinking about preferences over consumption lotteries in which you consume different amounts in different states of the world.

Example 1: Betting on a Coin Toss

Start with $250 for sure.

If you bet $150 on a coin toss, you would face the lottery: 50% chance of $100, 50% chance of $400.

Would you do it?

Example 2: Deal or No Deal

Two briefcases left: $200K and $1 million.

The "banker" offers you $561,000 to walk away; or you could pick one of the cases.

Would you accept that offer? What's the highest offer you would accept?

Lotteries

A lottery is a set of outcomes, each of which occurs with a known probability.

pollev.com/chrismakler

Suppose I were to offer you a choice:

5 extra points, for sure, on the midterm you just took; or

a 50/50 chance of 1 extra point or 9 extra points.

Which would you vote for?
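Note that the two options have the same expected value: \(\tfrac{1}{2}(1) + \tfrac{1}{2}(9) = 5\) points. The choice is between a sure 5 points and a fair gamble around 5 points, so your vote reveals your attitude toward risk.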

Example 1: Betting on a Coin Toss

Instead of keeping $250 for sure, betting $150 on a coin toss gives you the lottery: 50% chance of $100, 50% chance of $400.

We can represent a lottery as a "bundle" in "state 1 - state 2 space"

Independence Assumption

Suppose the way you feel about money doesn't depend on the state of the world.

Payoff if don't take the bet: \(u(250)\)

Payoff if win the bet: \(u(400)\)

Payoff if lose the bet: \(u(100)\)
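Putting these together (using the expected-utility criterion defined below), you should take the bet only if \(\tfrac{1}{2}u(100) + \tfrac{1}{2}u(400) \ge u(250)\).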

\text{Suppose you consume }c_1\text{ with probability }\pi_1
\text{and consume }c_2\text{ with probability }\pi_2
\text{Expected consumption: }\mathbb{E}[c] = \pi_1 c_1 + \pi_2 c_2
\text{Suppose your utility from consuming }c\text{ is }u(c)
\text{Expected utility: }\mathbb{E}[u(c)] = \pi_1 u(c_1) + \pi_2 u(c_2)
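For Example 1, expected consumption is the same whether or not you bet: \(\mathbb{E}[c] = \tfrac{1}{2}(100) + \tfrac{1}{2}(400) = 250\). A gamble that leaves expected consumption unchanged like this is called a fair bet.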

Expected Utility

"Von Neumann-Morgenstern Utility Function"

Probability-weighted average of a consistent within-state utility function \(u(c_s)\)

\text{Expected consumption: }\mathbb{E}[c] = \pi_1 c_1 + \pi_2 c_2
\text{Expected utility (i.e., utility from taking the gamble): }\mathbb{E}[u(c)] = \pi_1 u(c_1) + \pi_2 u(c_2)
\text{Utility from having }\mathbb{E}[c]\text{ for sure: }u(\mathbb{E}[c]) = u(\pi_1 c_1 + \pi_2 c_2)

Risk Aversion

\text{Risk averse: }u(\mathbb{E}[c]) > \mathbb{E}[u(c)]

You prefer having \(\mathbb{E}[c]\) for sure to taking the gamble.

\text{Risk neutral: }u(\mathbb{E}[c]) = \mathbb{E}[u(c)]

You're indifferent between the two.

\text{Risk loving: }u(\mathbb{E}[c]) < \mathbb{E}[u(c)]

You prefer taking the gamble to having \(\mathbb{E}[c]\) for sure.
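To make these comparisons concrete, here is a minimal Python sketch using Example 1 and an assumed within-state utility function \(u(c) = \sqrt{c}\) (an illustrative concave choice, not something specified on the slides):

import math

# Assumed within-state utility function: concave, so it will exhibit risk aversion.
def u(c):
    return math.sqrt(c)

pi = (0.5, 0.5)   # probabilities of the two states (win / lose the coin toss)
c = (400, 100)    # consumption in each state if you take the bet (Example 1)

expected_c = sum(p * x for p, x in zip(pi, c))      # E[c] = 250
expected_u = sum(p * u(x) for p, x in zip(pi, c))   # E[u(c)] = 0.5*20 + 0.5*10 = 15
u_of_sure_thing = u(expected_c)                     # u(E[c]) = sqrt(250), about 15.81

print(expected_c, expected_u, u_of_sure_thing)
# Since u(E[c]) > E[u(c)], this agent is risk averse: they prefer
# keeping $250 for sure to taking the bet.

Any strictly concave \(u\) delivers the same ranking (this is Jensen's inequality); a linear \(u\) would make the two sides equal, which is the risk-neutral case.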

Certainty Equivalence

"How much money would you need to have for sure

to be just as well off as you are with your current gamble?"

Risk Premium

"How much would you be willing to pay to avoid a fair bet?"

[The game tree of the Bayesian game from earlier, shown again.]

How do we combine risk with this analysis?

Remember that the "payoffs" we list are utilities... so maybe the monetary payoffs are these values squared.

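A minimal sketch of what that would mean, assuming (as the slide suggests) that the monetary payoff behind a listed utility \(v\) is \(m = v^2\), so the implied utility-of-money function is \(u(m) = \sqrt{m}\):

import math

# Take one cell's pair of possible utils (0 and 4 appear in the game) for illustration.
utils = (0, 4)
money = tuple(v**2 for v in utils)   # implied monetary payoffs: (0, 16)
pi = (0.5, 0.5)                      # Nature's coin flip

expected_util = sum(p * v for p, v in zip(pi, utils))        # 2.0
expected_money = sum(p * m for p, m in zip(pi, money))       # 8.0
util_of_expected_money = math.sqrt(expected_money)           # about 2.83

print(expected_util, expected_money, util_of_expected_money)
# The Bayesian normal form should average the utils (2.0), which is exactly
# expected utility; averaging the dollars and then taking utility (about 2.83)
# is a different object and not what expected-utility players care about.

This is the sense in which averaging the listed payoffs is legitimate: they are already utilities, not dollars.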

Mitigating Risk

(a brief foray into finance)
