โŒ

Reading view

From AI companions to climate action, we undervalue what lies ahead

Millions of people around the world now use AI companions โ€” for friendship, emotional support, mental health counselling and romantic interactions. This includes 72 per cent of adolescents, according to one study from the United States.

Meanwhile, human-caused climate change has already led to widespread impacts and rising risks, some of them irreversible. Yet emissions remain high.

As a professor of finance, I see these phenomena as different expressions of the same underlying bias: we apply too high a discount rate to the future.

The idea of a discount rate is straightforward. A dollar today is worth more than a dollar tomorrow. The discount rate tells us by how much. Set that rate too high, and you systematically undervalue what lies ahead. Set it too low, and you over-invest in distant outcomes.
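To make the idea concrete, here is a small worked example (my own illustration, with hypothetical numbers, not figures from the article) of how the discount rate controls what the future is worth today:

```python
# Illustrative sketch: present value of a future cash flow
# under a constant annual discount rate.

def present_value(future_amount: float, rate: float, years: int) -> float:
    """Discount a future amount back to today at a constant annual rate."""
    return future_amount / (1 + rate) ** years

# $100 received 10 years from now, valued today:
low = present_value(100, 0.03, 10)   # modest 3% rate
high = present_value(100, 0.15, 10)  # steep 15% rate

print(f"At 3%:  ${low:.2f}")   # about $74.41
print(f"At 15%: ${high:.2f}")  # about $24.72
```

At a modest rate the future still counts for most of its face value; at a steep rate the same $100 shrinks to pocket change, which is exactly what it means to "mark down" the future until it stops constraining the present.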

In many parts of life, we set this rate too high. Behavioural economist David Laibson showed that people place disproportionate weight on immediate rewards, even when this leads to worse outcomes over time.
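Laibson formalized this pattern as "quasi-hyperbolic" (beta-delta) discounting, in which anything delayed at all takes an extra penalty. The sketch below uses illustrative parameter values of my own choosing, not empirical estimates:

```python
# A sketch of quasi-hyperbolic (beta-delta) discounting.
# Parameter values are illustrative assumptions, not estimates.

def quasi_hyperbolic_value(reward: float, delay: int,
                           beta: float = 0.7, delta: float = 0.95) -> float:
    """Perceived present value of a reward received `delay` periods from now.

    beta < 1 applies an extra one-time penalty to *any* delayed reward,
    capturing the disproportionate pull of immediate gratification.
    """
    if delay == 0:
        return reward
    return beta * (delta ** delay) * reward

# An immediate $10 feels worth more than $15 one period away...
print(quasi_hyperbolic_value(10, 0))  # 10.0
print(quasi_hyperbolic_value(15, 1))  # 9.975
# ...yet comparing periods 1 and 2, the larger, later reward wins:
print(quasi_hyperbolic_value(15, 2) > quasi_hyperbolic_value(10, 1))  # True
```

The reversal is the point: the same person who would wait an extra period for the bigger reward when both options are in the future will grab the smaller one the moment it becomes available now.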

In finance, we understand that valuation depends critically on the discount rate applied to future cash flows. In life, we continue to apply a discount rate that is too high, marking down the future to the point where it no longer meaningfully constrains the present.

What feels good now

Psychologist Hal Hershfieldโ€™s research on the future self helps explain why. People often perceive their future selves more as another person than as a continuation of who they are now. This makes it easier for the self that benefits today to shift costs onto the self that must bear them tomorrow.

Looking at this through a finance lens, it resembles a โ€œprincipal-agent problem,โ€ where managers may prioritize short-term incentives over the long-term interests of shareholders.

In both cases, the person making the decision does not fully bear the long-term cost. But the future does not disappear. It simply becomes easier to ignore.

Investment in relationships

This logic becomes easier to see if we look at how we build relationships. Strong relationships require time and a willingness to tolerate discomfort.

Trust and intimacy involve immediate effort, but the benefits accumulate gradually. By contrast, autonomy and flexibility offer immediate rewards. They preserve options and reduce constraints, making it easy to defer relational investment.

But relationships, like other forms of capital, depend on sustained investment, and delayed investment is often hard to recover later.

The same logic can also be seen in family structures and broader social connections. Strong ties in families, friendships and communities depend on time and repeated interaction. Without it, those ties weaken.

As those ties weaken, loneliness becomes more likely. Research shows that loneliness and social isolation are associated with significant health risks. In this sense, loneliness can be understood as the long-term consequence of insufficient investment in connection when it was easier to build.

These patterns are not only individual. They also reflect the way modern life is increasingly organized around immediacy and convenience. Technology makes interaction faster, easier and more responsive, but many of the things that matter most in the long run still require time, patience and discomfort. The result is a social environment that increasingly rewards responsiveness over endurance.

Immediate benefits

Seen in this light, AI companions are not an anomaly. They are emerging in an era of widespread loneliness, where many people are seeking connection that feels reliable and low in emotional cost.

Back in 2002, pioneering research by Clifford Nass and Youngme Moon showed that people apply social rules to computers even when they know theyโ€™re not human. Almost 25 years later, research now suggests AI can provide emotional support and a real sense of companionship in the short term. From todayโ€™s perspective, this is an efficient solution: the benefits are immediate and reliable.

The concern is not that AI companionship fails. Itโ€™s that it succeeds too well in the present. By reducing effort, uncertainty and emotional risk, AI companions make connection easier to access but may also shift expectations in ways that are harder to sustain over time in human relationships. In that sense, they reflect the same trade-off: immediate comfort at the expense of longer-term relational depth.

The same logic extends beyond individual life and helps explain how societies respond to long-term problems.

Climate change is perhaps the clearest example. The impacts of our warming planet are already very evident and yet weโ€™re slow to act. This is, in part, because the economic benefits of extraction and consumption are immediate, while many of the costs are delayed and dispersed across time.

A voiceless future

Across many human domains, from AI and personal relationships to climate change, the structure is the same: The present is immediate and rewarded; the future is abstract, distant and silent. So, decisions skew toward today.

This is not simply a matter of awareness or intention. It is structural. The future has no meaningful representation in present decision-making. It has no voice, no urgency and no direct claim. And so itโ€™s discounted.

This is what Canadian Prime Minister Mark Carney called the โ€œtragedy of the horizon.โ€ Whether in the climate crisis or the loneliness epidemic, the catastrophic impacts will be felt beyond the traditional horizons of investment cycles and political terms.

Because the future has no seat at the board table, it is treated as an externality โ€” a cost we donโ€™t have to account for today, but one that is compounding at an unsustainable rate.

Until we find ways to give the future a real stake in present decisions, we will continue to choose what is easier now and pay for it later.

The tendency to discount the future is deeply human. But in a world increasingly shaped by AI systems, weakening social ties and accelerating climate risk, the costs of doing so are becoming harder to ignore.

The Conversation

Rahul Ravi does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
