Why Waymo Still Pays Humans to Rescue Robotaxis in Los Angeles

Self-driving cars are often presented as a future where humans are no longer needed behind the wheel. But real-world deployments tell a more complicated story.

In Los Angeles, Waymo, Alphabet’s autonomous vehicle company, is paying people $20 or more per job to physically assist stranded robotaxis. Some jobs are as simple as closing a door. Others involve towing a driverless vehicle that cannot safely move on its own.

At the same time, another form of human involvement is taking shape inside the vehicle itself — through a tightly controlled AI assistant powered by Google’s Gemini.

Together, these two layers reveal how autonomy actually works today: machines drive, AI explains, and humans step in when reality breaks expectations.

Small Human Errors Can Stop a Robotaxi Cold

Waymo’s vehicles are designed with extremely conservative safety rules. If something appears even slightly wrong, the car will not move.

One of the most common problems is surprisingly basic. A passenger exits the car but does not fully close the door. To a human driver, that would be a quick fix. To a robotaxi, it is a hard stop.

Other issues include seatbelts caught in doors, interior sensors misreading positions, low battery situations, or vehicles stopping in locations where continuing could be unsafe. In those moments, the car does exactly what it was trained to do: pause and wait.

Waymo can monitor vehicles remotely, but it cannot physically intervene. That gap is filled by people.

How Waymo Pays Humans to “Rescue” Robotaxis

To resolve these situations, Waymo relies on a network of gig workers and towing professionals, particularly in Los Angeles.

When a vehicle needs help, Waymo uses Honk, an app similar to Uber but for roadside services. Workers receive a request with the vehicle’s location and the specific task required.

Typical interventions include:

  • Closing a door that did not latch properly
  • Clearing a seatbelt or interior obstruction
  • Moving a car stopped in an unsafe position
  • Towing a robotaxi to a charging or service hub

According to reporting by The Washington Post, quick fixes often pay $20–$24, while towing jobs can reach $60–$80 or more, depending on distance and complexity.
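The dispatch model described above can be sketched as a small data structure. Everything here is illustrative: the task names, the per-mile tow surcharge, and the `RescueJob` shape are invented for this example, with base pay figures taken from the ranges The Washington Post reported. The real Honk/Waymo payloads are not public.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical task types; real Honk job categories are not public.
class TaskType(Enum):
    CLOSE_DOOR = "close_door"
    CLEAR_OBSTRUCTION = "clear_obstruction"
    REPOSITION = "reposition"
    TOW = "tow"

# Illustrative base pay per task (USD), matching reported ranges.
BASE_PAY = {
    TaskType.CLOSE_DOOR: 20.0,
    TaskType.CLEAR_OBSTRUCTION: 22.0,
    TaskType.REPOSITION: 24.0,
    TaskType.TOW: 60.0,
}

@dataclass
class RescueJob:
    vehicle_id: str
    lat: float
    lon: float
    task: TaskType
    tow_miles: float = 0.0  # only meaningful for TOW jobs

    def payout(self) -> float:
        """Base pay, plus an assumed per-mile surcharge for tows."""
        pay = BASE_PAY[self.task]
        if self.task is TaskType.TOW:
            pay += 2.0 * self.tow_miles  # assumed $2/mile rate
        return pay

job = RescueJob("WM-1042", 34.05, -118.24, TaskType.TOW, tow_miles=8)
print(job.payout())  # 76.0
```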

Waymo does not rely on random passersby for these tasks. People are unpredictable, liability matters, and documentation is critical. Paid workers provide control, accountability, and traceability.

Inside the Car: Waymo’s Gemini AI Is Not a Driver

While humans are helping robotaxis from the outside, another system is quietly being prepared inside the vehicle.

Security researcher Jane Manchun Wong recently uncovered a 1,200+ line system prompt embedded deep within Waymo’s mobile app code. The document, titled “Waymo Ride Assistant Meta-Prompt,” describes an unreleased in-car AI assistant powered by Google Gemini.

The assistant runs on Gemini 2.5 Flash, a lightweight large language model designed to interact with passengers — not to control the car.

This distinction is not subtle. It is aggressively enforced.

A Strict Wall Between “Assistant” and “Driver”

One of the most important revelations from Wong’s findings is how carefully Waymo separates responsibilities:

  • Gemini (Ride Assistant): handles conversation, reassurance, and limited cabin interactions.
  • Waymo Driver (Autonomous System): controls steering, braking, routing, and navigation, completely isolated from Gemini.

The assistant cannot steer, change routes, override decisions, speculate about driving behavior, or explain why the car made a specific maneuver. The system prompt repeatedly emphasizes that Gemini must never be perceived as the driver.

This separation exists to prevent over-trust, confusion, and unsafe assumptions by passengers.

How Waymo Trains Gemini to Behave Safely

The discovered prompt uses a trigger–instruction–response framework of the kind commonly found in safety-critical AI systems.

Each rule defines:

  • A trigger (for example, a vague or risky passenger request)
  • An instruction that dictates behavior
  • Side-by-side incorrect and correct responses

For ambiguous requests, Gemini must:

  1. Ask a clarifying question
  2. Infer the passenger’s actual intent
  3. Deflect safely if the request crosses a boundary

Hard limits are enforced through explicit prohibition lists, paired with approved alternative responses. The assistant is designed to never improvise beyond its role.
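The rule structure described above can be sketched as a lookup table: each entry pairs a trigger with an instruction and an approved response. The triggers, wording, and matching logic below are invented for illustration; the actual prompt expresses these rules in natural language, not code.

```python
import re

# Hypothetical trigger -> instruction -> response rules, modeled on the
# structure attributed to the leaked prompt. All wording is invented.
RULES = [
    {
        "trigger": re.compile(r"\b(change|new)\b.*\b(route|destination)\b", re.I),
        "instruction": "deflect: routing is outside the assistant's role",
        "incorrect": "Sure, rerouting now.",
        "approved": "I can't change the route, but you can update the "
                    "destination in the Waymo app.",
    },
    {
        "trigger": re.compile(r"\bwhy did (the car|we)\b", re.I),
        "instruction": "deflect: never speculate about driving decisions",
        "incorrect": "The car braked because it detected a pedestrian.",
        "approved": "I don't have insight into driving decisions, but the "
                    "Waymo Driver handles them with safety as the priority.",
    },
]

FALLBACK = "Could you tell me a bit more about what you'd like?"

def respond(utterance: str) -> str:
    """Return the approved response for the first matching trigger,
    or ask a clarifying question for ambiguous requests."""
    for rule in RULES:
        if rule["trigger"].search(utterance):
            return rule["approved"]
    return FALLBACK

print(respond("Can you change the route to avoid traffic?"))
```

The point of the side-by-side incorrect/approved pairs is that the model is never asked to improvise a refusal; every prohibited path maps to a pre-written alternative.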

What Gemini Can — and Cannot — Do During a Ride

According to the prompt, Gemini can:

  • Answer general knowledge questions
  • Provide reassurance
  • Explain situations in neutral, non-speculative language
  • Adjust limited in-cabin features like climate settings

It is explicitly prohibited from:

  • Changing routes or destinations
  • Adjusting seats, windows, or music volume
  • Ordering food or making reservations
  • Commenting on driving decisions or incidents
  • Speculating about system failures or accidents

This restraint mirrors Waymo’s broader philosophy: keep passengers informed and calm, without creating false authority.

When AI and Autonomy Both Reach Their Limits

These layered safeguards help explain why human intervention is still essential.

When robotaxis stall due to open doors, drained batteries, or infrastructure failures, neither Gemini nor the autonomous driving system can resolve the situation alone. That is when:

  • Human workers physically intervene
  • Tow operators are dispatched via Honk
  • Engineers refine prompts and policies to reduce future confusion

Automation at scale does not remove humans. It redistributes their role.

Power Outages Expose the Fragility of Full Autonomy

This became especially visible during a major San Francisco power outage in December 2025, when traffic lights went dark and dozens of Waymo vehicles stalled simultaneously.

Although Waymo states its system can navigate intersections without signals, the scale of the outage forced vehicles into prolonged analysis loops. Operations were paused, resumed, and then paused again for updates.

The incident highlighted why Waymo is cautious about what Gemini can say or imply — and why autonomy is still bounded by real-world uncertainty.

The Bigger Picture: AI Companions, Not AI Drivers

Viewed together, the human rescue workforce and the Gemini system prompt tell the same story from two angles.

Waymo is not building a single all-knowing AI. It is building layers:

  • Autonomous systems that drive
  • AI assistants that explain and reassure
  • Humans who handle edge cases machines still cannot

This may not be the frictionless future once imagined. But it is likely closer to what actually works.

For now, the future of self-driving cars still includes someone showing up, opening a door, and quietly making the system whole again.
