Smartphone app screen glowing over dark urban street, autonomous robot delivering goods below, invisible data extraction

Pokémon Go turned ten years old. In 2016, it was a global phenomenon: people walking through streets, parks, and parking lots, phones raised like augmented-reality detectors pointed at the world. What nobody understood at the time was that it was also a spatial data harvesting engine. Thirty billion images later, those scans are training the robots that deliver your pizza.

Key Points

  • Since 2016, Pokémon Go players have contributed to building a database of 30 billion geolocated images for Niantic, without receiving monetary compensation for the data they generated.
  • In 2025, Niantic spun off its AI division into Niantic Spatial, which licenses Visual Positioning System (VPS) technology to robotics companies operating in GPS-unreliable urban environments.
  • Coco Robotics uses VPS derived from Pokémon Go data to navigate city streets where GPS signal degrades, enabling its autonomous delivery robots to operate with centimeter-level precision.
  • Player participation in AR scanning was technically voluntary and covered by the Terms of Service, but those terms were accepted years before Niantic Spatial or the commercial robotics market existed.
  • The gap between the moment of data collection and the moment of its commercial application is so wide that the connection between user action and corporate product is effectively invisible to anyone not actively looking for it.

Niantic CEO John Hanke described the connection without irony: "Getting Pikachu to realistically run around and getting Coco's robot to safely and accurately move through the world is actually the same problem." That sentence is the entire story. The game and the robot share the same perceptual infrastructure. The players built both.

The Game Was the Machine

Pokémon Go was built around augmented reality. To place a virtual Pokémon in the physical world, the app needs to understand where the camera sits in space. That requires a precise spatial map. Players provided that map every time they opened the app, simply by moving through the world with their phones.

Two specific features accelerated the collection at scale. The first was AR Scanning, introduced around 2020: players at level 20 or above could scan physical PokéStops (statues, storefronts, monuments) from multiple angles in exchange for in-game XP. The feature explicitly asked players to walk around a subject and record video from different perspectives. The second was periodic Field Research tasks: missions that asked players to photograph specific categories of physical objects to earn in-game rewards.

The result was something that no traditional data collection programme could have produced: images of every corner, in every lighting condition, in every season, from millions of people in every city on earth. The diversity and density of the dataset is its core value. A professional mapping team could spend decades and not replicate it.

Players received rare Pokémon and cosmetic items. Niantic received the spatial foundation for a robotics business.

Enter Niantic Spatial

In 2025, Niantic spun off its AI division into an independent company: Niantic Spatial. The core product is the Visual Positioning System, a technology that uses visual recognition to place objects in physical space with centimeter-level precision. The system works even where GPS degrades: inside buildings, on streets hemmed in by skyscrapers, under bridges, in the dense urban canyons where delivery infrastructure actually operates.

VPS is not a consumer product. It is B2B infrastructure. Niantic licenses it to companies that need reliable spatial positioning: delivery robots, indoor navigation systems, industrial AR platforms.

Coco Robotics is the most visible customer. Their robot, a compact autonomous vehicle that travels on sidewalks delivering food, uses VPS to orient itself in urban environments where GPS alone is insufficient. The robot does not guess at its location. It recognises the world around it, matching what its cameras see against a database built from billions of Pokémon Go scans.

The value chain

Players contributed images. Niantic trained the VPS. Niantic Spatial licensed the technology. Coco Robotics built delivery robots. Restaurants and retailers pay for last-mile delivery. At every stage except the first, someone gets paid.

What the Terms of Service Said

Niantic's response to criticism has been consistent and technically accurate: AR scanning participation was voluntary. Players had to reach level 20, opt into the feature explicitly, and images were anonymised before processing. The Terms of Service authorised the use of contributed data. Everything was disclosed.

All of that is true. And there is still a significant distance between "technically permitted" and "reasonably anticipated."

The players who opened Pokémon Go in 2016 were choosing to catch Pikachu in the park. They were not choosing to contribute to the perceptual infrastructure of a billion-dollar robotics industry that would emerge nine years later. Niantic Spatial did not exist. Coco Robotics did not exist. The market for commercially deployed autonomous delivery robots was an academic concept. The ToS they agreed to covered a future that was not described to them, because it had not yet been imagined by anyone.

This is the fundamental asymmetry between platform and user: the platform accumulates optionality over time, using data whose future value cannot be assessed at the moment of collection. The user makes a decision based on a present that has already become history by the time the data is deployed. Consent is real. Understanding of that consent is not.

No contract clause closes this gap. The distance between collection and application is the mechanism, not an accident.

The CAPTCHA Precedent

Pokémon Go is not the first case. It is the most visible, because it has a recognisable brand, characters people are attached to, and a user base in the hundreds of millions. The underlying pattern is older and more systematic.

Google's reCAPTCHA followed identical logic for years. The "click on all the traffic lights" tests were not only anti-bot verification. They were labelling tasks for Google's vision systems. Users transcribed digitised book text for Google Books, identified street numbers from Street View for Maps, and classified objects in photographs for computer vision training. No compensation. Frequently no awareness.

Waze built its real-time road condition database on user navigation data. Duolingo uses student corrections to improve its language models. Shazam used audio clips from users to train its recognition systems. The pattern is consistent across products and industries: the app is the collection mechanism, the user is the uncompensated worker, the product is the AI model, and the buyer is another company.

What distinguishes the Niantic case is the scale of the secondary market. The Coco Robotics integration is not an internal product improvement. It is a licensed commercial product built on a dataset that Niantic's users generated. The gap between user contribution and corporate revenue has rarely been this legible.

An Army of What, Exactly?

Hanke's framing, "getting the robot to safely move through the world," is deliberately benign. Delivering food is benign. But it is worth examining what the underlying technology actually is, separate from its current application.

VPS is a system for centimeter-precise spatial positioning in real-world environments. It works by matching visual input against a database of geolocated images. That description applies equally to delivery robots, to autonomous vehicles, to industrial robots operating in public spaces, to drone navigation, and to systems designed to locate and track specific objects or people within environments that have already been mapped.
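The core retrieval step can be illustrated with a minimal sketch, assuming a simplified pipeline: images are reduced to embedding vectors, and a query is localised by finding the most visually similar entry in a geolocated index. All names, data, and the matching strategy here are hypothetical; production systems use learned features and geometric pose refinement, not a bare nearest-neighbour lookup.

```python
import numpy as np

# Hypothetical geolocated index: each image embedding is paired with the
# lat/lon where the image was captured. Random data stands in for real scans.
rng = np.random.default_rng(0)
db_embeddings = rng.normal(size=(1000, 128))
db_positions = rng.uniform(low=[34.0, -118.3], high=[34.1, -118.2], size=(1000, 2))

def localize(query_embedding):
    """Return the position of the database image most similar to the query,
    using cosine similarity over normalised embeddings."""
    q = query_embedding / np.linalg.norm(query_embedding)
    db = db_embeddings / np.linalg.norm(db_embeddings, axis=1, keepdims=True)
    best = int(np.argmax(db @ q))
    return db_positions[best]

# A camera frame taken near a known scan (simulated as a slightly perturbed
# database embedding) localises to that scan's position.
query = db_embeddings[42] + rng.normal(scale=0.01, size=128)
print(localize(query))
```

The point of the sketch is the dual-use observation in the text: nothing in this lookup cares whether the query comes from a delivery robot, a drone, or a tracking system. The capability lives in the database.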

The Coco Robotics application is genuinely low-stakes. The infrastructure supporting it is dual-use. Proton's analysis of the Waze precedent is instructive: navigation data that users generated to avoid traffic was used by law enforcement to reconstruct movement patterns. Anonymisation promises proved more permeable than claimed. A VPS database trained on the neighbourhoods where millions of people live, built from the inside by people who traverse those spaces daily, represents a spatial model of the physical world with capabilities that extend well beyond pizza delivery.

This is not a warning that Niantic will misuse its data. It is an observation about what the infrastructure is, and who owns it.

What Compensation Would Look Like

The debate about data compensation is not new, and it is not purely theoretical. Three frameworks have been seriously proposed.

The first is data dividends: a percentage of revenue derived from data contributed by users is redistributed to the people who generated it. California explored this at the legislative level. The concept is straightforward; the implementation is genuinely difficult, because attributing commercial value to individual data points within a dataset of billions is not a solved problem.
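The attribution difficulty is easier to see with a concrete sketch. The simplest possible scheme, a pro-rata split of a fixed revenue share by scan count, already forces contested decisions: what counts as a contribution unit, and whether a scan of an unmapped street is worth the same as the ten-thousandth scan of a landmark. All figures and names below are illustrative.

```python
# Minimal pro-rata data dividend: a fixed share of licensing revenue is
# split across contributors in proportion to their accepted scans.
def data_dividend(revenue, dividend_rate, scans_by_user):
    pool = revenue * dividend_rate
    total_scans = sum(scans_by_user.values())
    return {user: pool * n / total_scans for user, n in scans_by_user.items()}

payouts = data_dividend(
    revenue=1_000_000,      # annual licensing revenue (illustrative)
    dividend_rate=0.05,     # 5% of revenue set aside for contributors
    scans_by_user={"alice": 300, "bob": 100, "carol": 600},
)
print(payouts)  # {'alice': 15000.0, 'bob': 5000.0, 'carol': 30000.0}
```

Even this toy version assumes the hard part is solved: a clean ledger mapping each image in a thirty-billion-item dataset back to an identifiable, payable person, which is precisely what anonymised collection pipelines are built not to preserve.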

The second is opt-in with economic equity: the AR scanning feature could have been structured as paid work rather than a game mechanic with virtual rewards. The collection task was real labour. The compensation was cosmetic items. Reframing that exchange as employment rather than gameplay would have changed both the legal relationship and the player's understanding of what they were doing.

The third is transparency at point of collection: if the scanning feature had stated plainly that contributed images might be used to train commercial AI systems licensed to third parties, the consent dynamic would have been different. Not necessarily worse for Niantic, since many players might have contributed regardless, but honest about the nature of the exchange.

None of these frameworks have been implemented at meaningful scale by any major platform. All three have been discussed in policy settings for the better part of a decade. The gap between proposal and implementation is itself a data point about where industry incentives actually sit.

The Pokémon Go case also connects directly to the broader debate on delivery robotics and labor displacement: the same robots being navigated by Niantic's VPS are the robots competing with the human gig workers who currently deliver that food. The data that eliminated the need for a skilled mapping team is now training the system that may eliminate the delivery job itself.

Pokémon Go is still active. Niantic Spatial is signing new partners. Coco Robotics is expanding its operating territory. The dataset continues to grow with every scan, every AR task, every player who points their phone at a PokéStop on a street that has not yet been mapped by a Google vehicle. The game continues. So does the factory. The players just do not know which shift they are working.