Over What Horizon, and With What Discount Rate?
Tyler Cowen has a fun little question for us to think about:
There is an enormous and growing discussion on AI alignment, but very little on capitalizing AIs, and what effects that might have. By capitalizing AIs, I mean simply requiring them to hold some wealth, in whichever form they might care about, so they have proverbial “skin in the game” (can we still call it that?).
What does the word "wealth" mean to you? Here's how Google defines it:
"a lot of money, property, etc. that somebody owns; the state of being rich"
Here's a thought experiment for you - imagine that you can go back in time ten thousand years, and you offer a little baby from that time all of present-day Europe. You take that baby into a helicopter, and you make a grand aerial tour of the little baby's new kingdom. It's all yours, you say. All of it, all yours.
Gurgle and coo, says the baby by way of reply. And then proceeds to pee all over you.
You see, all of Europe simply doesn't mean anything to a baby, any baby. Its intelligence simply isn't developed enough to be able to understand what you're saying.
But now here's another thought experiment. Let's say you go back in time nine thousand nine hundred and eighty years. That little baby is now the young king of a tribe that just so happens to rule over all of Europe, and that young king offers you all of these lands, in perpetuity, in return for something valuable. Like maybe the measles vaccine - there's no telling what ancient people would find cool and worthwhile, I tell you.
Ah, but you're from the future. You know how much Europe is going to be worth in the years to come. Owning all of Europe, for ten thousand years? Wow, now that's a good deal!
You see, the point I am trying to make is that wealth is a concept dependent on two things.
First, what is your time horizon? The young king from ten thousand years ago may have thought about the next ten generations, at best, when it came to the concept of wealth. Five hundred years or so, at most. He simply wouldn't have had the ability to think of a longer time horizon than that - or that's my guess, at any rate.
You, on the other hand, knowing what you do about the next ten thousand years, think about wealth very differently. You don't just see land and water, you see - you see mineral deposits, bridges, dams, cities, buildings, rocket launch sites, farms, and so much more besides. The future is at least as important in your decision making as the present - and you could certainly make the argument that the future is perhaps even more important than the present. What you are being offered is so much more valuable than the seller realizes because the number of possibilities - optionalities - increases over time, and dramatically so.
And that gets me to the second thing: what is your discount rate? Is yours different from that of the young king? Should it be different? Why or why not?
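If you'd like to see what a discount rate actually does to a far-off payoff, here is a minimal sketch. The numbers are purely illustrative (they don't come from this post): it applies the standard present-value formula, payoff divided by (1 + rate) raised to the number of years.

```python
# A minimal sketch of exponential discounting. All numbers below are
# illustrative assumptions, not figures from the post.
def present_value(payoff: float, rate: float, years: int) -> float:
    """PV = payoff / (1 + rate)^years."""
    return payoff / (1 + rate) ** years

# $1 million received 500 years from now, under two discount rates:
patient = present_value(1_000_000, 0.001, 500)   # near-zero discounting
impatient = present_value(1_000_000, 0.05, 500)  # a conventional 5% rate

print(f"patient:   {patient:,.2f}")    # still a substantial sum
print(f"impatient: {impatient:,.2f}")  # vanishingly close to zero
```

The point of the toy example: over a 500-year horizon, the choice of discount rate isn't a tweak, it's the whole ballgame - the same payoff is either worth most of its face value or effectively nothing.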
If these questions confuse you, copy this post and paste it into your AI of choice, and have a conversation about it!
Why does all this - horizons and discount rates - matter?
Here's a thought experiment for you: let's say an AI offers you a trillion dollars for the right to mine minerals on Jupiter for five thousand years, beginning only in the year 2500. In cash, today, no questions asked. (Assume, for the moment, that you have the legal right to give away these rights.)
Do you take the deal?
The point isn't to answer this specific question. The point is to help you realize that the way you think about this offer is very different from the way the AI thinks about it, because the AI has a) very different time horizons in mind and b) very different discount rates in mind.
And oh, in passing, did I ever tell you about the time my granddad sold the house we used to stay in, in Deccan Gymkhana? Right behind Cafe Good Luck, lovely place. He preferred selling the place to taking out a loan. Seemed like the better thing to do from a financial viewpoint to him, back then. I have a very different financial viewpoint today about what he did back in 1946, but come on, how was he to know? I wasn't even born then. Heck, neither was my mother!
"That's all very nice to hear. Your grandfather sounds like such a lovely man", says the AI comfortingly. "Now, about that Jupiter deal..."
What does it mean to play economic games with beings who see much more potential in the much more distant future than we can even begin to comprehend? If we give such beings table stakes in economic games that we play among ourselves, with our understanding, time horizons and discount rates, what might be the consequences?
As I said, a fun little question.