Housefull Economics

Splitting the Loot Is No Joke

Our weekly explainer on economics using lessons from popular culture. In Installment 57, The Dark Knight demonstrates the Shapley Value.

The opening scene of The Dark Knight is probably one of the most brilliant character introductions I have seen in recent times. Not only does it set the stage and the pace for the rest of the movie, it introduces the Joker as part loon and part psychopath who does things not because they fit into a grand plan (as he helpfully clarifies later, “Do I really look like a guy with a plan? You know what I am? I’m a dog chasing cars.”) but out of his love for chaos and the disruption of the established order.

For aficionados of game theory there is a ton of material to mine in the movie, but I am going to focus on the bank heist shown in this opening sequence.

GRUMPY: Three of a kind. Let’s do this.

CHUCKLES: That’s it? Three guys?

GRUMPY: There’s two on the roof. Every guy is an extra share. Five shares is plenty.

CHUCKLES: Six shares. Don’t forget the guy who planned the job.

GRUMPY: Yeah? He thinks he can sit it out and still take a slice? I know why they call him the Joker.

Grumpy is definitely not thrilled about giving the Joker an equal share of their bounty for what he sees as an unequal division of labor. But how should they have split the payoff?

The Shapley Value — named for Lloyd S. Shapley, a Nobel Prize-winning economist — provides one way to do that in a ‘fair’ manner. Fairness in this context is captured by three simple axioms.

  1. If a participant in the coalition is a dummy player, i.e. he/she contributes nothing to the final payoff, he/she gets nothing.
  2. If two participants in the coalition are interchangeable, i.e. they contribute the same to every possible coalition, they should get the same allocation in the final payoff.
  3. If the process (or the game) that generates value can be separated into two sub-games, the value allocated to each player in the full game is just the sum of his/her allocations in the two sub-games.

Based on the above axioms, the Shapley Value provides a mathematical means to calculate the fair allocation of the final payoff. The math boils down to figuring out how much incremental value an individual participant brings to the coalition in every possible arrangement of the coalition and then averaging those values.
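In symbols (my own notation, not from the original column): for a player set N and a coalition value function v, player i’s Shapley Value is the average of i’s marginal contribution over all |N|! orderings in which the coalition can be assembled.

```latex
\varphi_i(v) \;=\; \frac{1}{|N|!} \sum_{\sigma} \Big[\, v\big(P_i^{\sigma} \cup \{i\}\big) - v\big(P_i^{\sigma}\big) \,\Big]
% where the sum runs over all orderings \sigma of N, and P_i^{\sigma} is the
% set of players that come before i in the ordering \sigma.
```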

Let me illustrate this allocation with a simple three-player game.

Say A, B and C are three firms whose innovations don’t have much value on their own: the value each player generates solo is 0. But A and B together generate a value of 90, B and C a value of 70, and A and C a value of 80. If all three form a coalition, they generate a payoff of 120. Clearly, they should all collaborate, but if they do, what should each of them get from the final payoff?

In this coalition, A brings more to the table than B or C (A-B and A-C generate more than B-C). Allocating the final payoff equally (40 each) is unfair to A and overly generous to C, since A is the most valuable player and C the least. A fair allocation should reward players based on the value they bring to the coalition, or the coalition may never happen in the first place.

The Shapley Value provides a way to determine that allocation.

In the table below, the first row shows one possible ordering in which the coalition forms. Adding A alone generates a value of 0, since solo innovations are worthless. Adding B next brings the value of the coalition to 90, since A and B together now have something of value. Adding C to the party increases the overall value by only 30, since A-B already generated the first 90 and 120 is the maximum the grand coalition can generate. That is, the incremental value C brings in this ordering is 30.

Similarly, in the next row, we start with A in the group, with a value of 0. Adding C next brings 80 (the value of A-C), and finally adding B increases the payoff by only 40. That 40 is the marginal value of B to a coalition that already has A and C.

Running through all six orderings gives us the total marginal value each player generates; averaging those totals gives us each player’s average marginal contribution, which is their Shapley Value.

Order of Addition to the Coalition    Incremental Value of Participant
                                          A      B      C
A, B, C                                   0     90     30
A, C, B                                   0     40     80
B, A, C                                  90      0     30
B, C, A                                  50      0     70
C, A, B                                  80     40      0
C, B, A                                  50     70      0
TOTAL CONTRIBUTIONS                     270    240    210
SHAPLEY VALUE                            45     40     35
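If you prefer code to tables, here is a minimal Python sketch (my own illustration, not from the column) that brute-forces the same calculation: it encodes the coalition values from the example and averages each player’s marginal contribution over all six orderings.

```python
from itertools import permutations

# Coalition values from the example: singletons are worth 0, the pairs are
# worth 90 (A-B), 70 (B-C) and 80 (A-C), and the grand coalition is worth 120.
value = {
    frozenset(): 0,
    frozenset({"A"}): 0, frozenset({"B"}): 0, frozenset({"C"}): 0,
    frozenset({"A", "B"}): 90, frozenset({"B", "C"}): 70, frozenset({"A", "C"}): 80,
    frozenset({"A", "B", "C"}): 120,
}
players = ["A", "B", "C"]

def shapley_values(players, value):
    """Average each player's marginal contribution over every ordering."""
    totals = {p: 0.0 for p in players}
    orderings = list(permutations(players))
    for order in orderings:
        coalition = frozenset()
        for p in order:
            # How much does adding p to the current coalition increase its value?
            totals[p] += value[coalition | {p}] - value[coalition]
            coalition = coalition | {p}
    return {p: totals[p] / len(orderings) for p in players}

print(shapley_values(players, value))  # {'A': 45.0, 'B': 40.0, 'C': 35.0}
```

The output matches the table: 45 for A, 40 for B and 35 for C, which together exhaust the grand coalition’s payoff of 120.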

Given its ability to generate normative allocations within groups, the Shapley Value gets used in disciplines that don’t normally fit into what we think of as game-theoretic use cases. Social network research – who among a network of people is most valuable, and what, if anything, the other nodes contribute – or marketing research – which marketing channel contributes the most to your revenue, and how you should allocate resources across those channels – are just some of the examples.
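To make the marketing-attribution idea concrete, here is a small sketch with made-up conversion numbers (purely illustrative, not real data): treat each channel as a player, treat the conversions observed when only a given subset of channels is active as that coalition’s value, and apply the same permutation averaging as above.

```python
from itertools import permutations

# Hypothetical conversion counts for each subset of active channels.
conversions = {
    frozenset(): 0,
    frozenset({"email"}): 10,
    frozenset({"search"}): 40,
    frozenset({"social"}): 20,
    frozenset({"email", "search"}): 60,
    frozenset({"email", "social"}): 35,
    frozenset({"search", "social"}): 70,
    frozenset({"email", "search", "social"}): 90,
}
channels = ["email", "search", "social"]

credit = {c: 0.0 for c in channels}
orderings = list(permutations(channels))
for order in orderings:
    active = frozenset()
    for channel in order:
        # Marginal conversions this channel adds, given what is already active.
        credit[channel] += conversions[active | {channel}] - conversions[active]
        active = active | {channel}

shapley = {c: round(total / len(orderings), 1) for c, total in credit.items()}
print(shapley)  # roughly {'email': 15.8, 'search': 48.3, 'social': 25.8}
```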

An interesting and recent use case is machine learning. Most complex machine learning models tend to be black boxes when it comes to explaining the relative importance of the underlying features (the variables used in making a prediction). Computing Shapley Values for features allows for contrastive explanations and helps answer questions like, ‘Is the addition of this extra data variable causing me to recommend that product over another?’
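For instance, the widely used shap Python package estimates these per-feature Shapley Values. A minimal sketch, assuming shap and scikit-learn are installed (the dataset and model here are just placeholders):

```python
# pip install shap scikit-learn
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

explainer = shap.TreeExplainer(model)              # fast Shapley-value estimates for tree ensembles
shap_values = explainer.shap_values(X.iloc[:200])  # one contribution per feature, per prediction

# Rank features by how much, on average, they move the model's predictions.
shap.summary_plot(shap_values, X.iloc[:200])
```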

By the way, if you want to start using Shapley Values to equitably split rent or a taxi fare, here is a handy tool called Spliddit that does it for you.

So, could the Joker and the rest of the gang have used the Shapley Value to allocate their payoffs fairly? Possibly, but then again, the Shapley Value is for coalition games, and there wasn’t much of a grand coalition left by the end of the scene.

About the author

Satish Terala

Satish is a technology professional based out of Boston. He has degrees from UC Berkeley and the University of Toronto in doing marginally useful things. He still hopes to finish his GCPP - some day.