Author: Nancy, PANews
In the crypto world, tokens are certificates of digital rights, carrying various programmable rights and functions. As large AI models sweep the globe, a different kind of token is coming to the fore. As the smallest unit of computation consumed by these models, it is shaping a new economic narrative and quietly becoming an invisible arena of competition in overseas workplaces.

Under this business logic, computing power equals revenue, and tokens have risen to become the new currency of the AI world.
Today, overseas tech companies such as Meta, OpenAI, Amazon, Google, and Microsoft are incorporating token usage into their performance evaluation systems. Some companies have even established internal leaderboards to visually display the amount of tokens consumed by each team or individual, with those who underutilize tokens often being labeled as having low productivity or being out of sync with the company's culture.
The token, as a unit of measurement, is being pushed to the forefront of the global AI race. Whether measured by the scale of a product's consumption or the frequency of a team's calls, the token has evolved into the most central and intuitive value indicator for AI products: the higher the call volume, the more frequently the model is used, and the greater the economic leverage it generates.
To incentivize employees to use AI, major companies are offering token incentives. For example, Alibaba plans to provide employees with token quotas; Tencent offers employees a token allocation worth up to 220,000 yuan annually; and Nvidia even plans to provide engineers with a token budget equivalent to about half of their base salary.
Tokens are also being incorporated into compensation models. The engineering lead for Codex, OpenAI's AI coding service, revealed that more and more job candidates are no longer asking only about salary during interviews, but about "how much inference compute they can get." Renowned venture capitalist Tomasz Tunguz pointed out that tech companies have begun to treat inference costs as a fourth form of compensation for engineers, alongside base salary, bonuses, and stock options; by his estimate of current inference spending, tokens could account for roughly one-fifth of an engineer's compensation. At GTC 2026, Jensen Huang likewise stated explicitly that AI tokens will become an important component of engineer compensation, on par with wages, bonuses, and equity. Sam Altman has gone further, envisioning that "universal basic compute" might one day replace "universal basic income."
Some have suggested that AI companies like OpenAI and Anthropic should create dedicated recruitment websites that list token budgets next to salary ranges.
Even investors have started making investments directly in the form of tokens.
Recently, ZhenFund and Crossing jointly launched a Token Grant program, providing selected AI startups with a grant of 50,000 tokens. For AI startups, tokens are often more effective in addressing immediate needs than cash on hand.
In the AI era, a new narrative of the token economy is taking shape.
An expensive new status game is unfolding.
Recently, Kevin Roose, a technology columnist for The New York Times, pointed out that "tokenmaxxing" is sweeping Silicon Valley, becoming a new kind of stat-padding game among engineers.
He shared that one OpenAI engineer processed 210 billion tokens in the past week, roughly equivalent to 33 copies of Wikipedia's full text, the most of any employee; at Anthropic, one user of the company's AI coding tool, Claude Code, racked up more than $150,000 in spending in a single month. In this token-maxxing game, programmers are desperately trying to prove their efficiency and capabilities.
Originally intended as a way to measure productivity, tokens now appear to be evolving into a productivity showcase. This computational race has been amplified by the emergence of agentic tools like Claude Code and OpenClaw: engineers can keep AI sub-agents running continuously, working through different tasks overnight and driving token consumption sharply higher.
The emergence of a large number of agents has led to an order-of-magnitude leap in token consumption, allowing AI giants to reap huge profits.
For example, Anthropic's annualized revenue (ARR) recently surpassed $19 billion, nearly tripling since the end of last year; OpenAI's official data revealed that Codex's weekly active users have exceeded 2 million, with user numbers and usage increasing by 3 times and 5 times respectively since the beginning of this year; and OpenClaw's token consumption has skyrocketed, with 13.7 trillion tokens consumed in the past month alone.
In essence, tokens are a tool for companies to provide computing resources that encourage employees to deploy AI agents and raise productivity; they represent an investment in employee capabilities. But when tokens become KPIs, or even symbols of competence, does spending more necessarily translate into better work? Not necessarily.
Regarding the introduction of token-based compensation, former venture capitalist Jamaal Glenn argues that the productivity logic of "more tokens equals more efficiency and more money" only holds when the interests of employees and employers are perfectly aligned, a condition most employees do not meet. Tokens, which look like a perk, may in fact be a way of dressing up compensation; they are fundamentally different from cash or equity and will not carry value into future job negotiations. He suggests asking about token budgets in interviews, just as one would ask about hardware specs or development tools, but never letting them be written into an offer as part of compensation.
If you make AI perform meaningless repetitive tasks or over-refactor projects that are already in good shape, it not only produces no results but also masks true work efficiency.
Gartner, a leading global research firm, has also poured cold water on the trend. The firm stated that while token consumption is increasingly treated by AI companies as a signal of scale, adoption, and market leadership, rapidly rising token consumption does not guarantee long-term viability. Token counts are structurally ill-suited to assessing AI success and may even mislead decision-makers within organizations.
Gartner points out that what truly determines long-term viability are monetization principles, profit margin sustainability, and enterprise penetration. Leaders responsible for AI should de-emphasize token metrics and instead evaluate AI vendors based on solution capabilities, decision-making empowerment, cost predictability, and quantifiable business results.
In addition, as tokens become hard currency, demand is exploding and costs are rising in tandem.
Data from OpenRouter, the world's largest AI model API aggregation platform, shows that Chinese large models handled 4.69 trillion tokens of calls last week, surpassing the United States for the second consecutive week.
JPMorgan Chase predicts that the compound annual growth rate of token consumption in China will reach 330% between 2025 and 2030, an increase of approximately 370 times in five years. IDC predicts that by 2030, the number of active AI agents globally will reach 2.216 billion, and annual token consumption will surge from 0.0005 Peta Tokens (1 Peta = 1000 trillion) in 2025 to 152,000 Peta Tokens, an increase of over 300 million times.
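Taking the IDC figures above at face value, a quick back-of-the-envelope check (using only the numbers cited in this article) confirms the scale of that jump:

$$
\frac{152{,}000\ \text{Peta tokens}}{0.0005\ \text{Peta tokens}} = 3.04 \times 10^{8}
$$

That is, roughly a 300-million-fold increase between 2025 and 2030, consistent with the figure quoted above.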
As deployment volume moves from experimental to large-scale application, cost pressures are forcing the industry to adjust prices to varying degrees. Overseas giants such as Amazon and Google have raised prices, and even inexpensive and high-quality domestic large-scale models are struggling to cope with the surge in deployment volume, with vendors such as Alibaba Cloud, Tencent Cloud, and Zhipu successively raising their prices.
As usage continues to increase, once major model vendors stop subsidizing prices, many startups and workflows that rely on massive amounts of tokens will face an extremely severe cost crisis.
This "token inflation game" will continue for a while, and once the tide goes out, we'll see which engineer was swimming naked.


