“Can’t wait for these to cause mayhem downtown.” This Reddit comment from a San Antonio resident captures the sentiment many felt when Waymo announced its autonomous vehicle “road trip” to the Alamo City. With fewer than 10 self-driving cars set to navigate our downtown streets and major highways, San Antonians are right to question what this means for their safety—especially given recent recalls and concerning incidents in nearby Austin.
As personal injury attorneys serving San Antonio, we believe residents deserve to know not just about the technology’s promises, but also about the very real risks these testing programs pose and what legal protections exist when corporate experiments go wrong on public roads.
The “Road Trip” That Isn’t a Vacation: Understanding Waymo’s Testing Strategy
Waymo frames its San Antonio presence as a “road trip,” conjuring images of casual exploration and discovery. But make no mistake—this is a calculated corporate testing program using our public streets as a laboratory. The company selects cities where “conditions and driving culture differ from those in its usual operating areas,” essentially acknowledging that San Antonio drivers will face unique challenges as these vehicles learn our local driving patterns.
This characterization matters legally. When companies conduct testing on public roads, they assume certain responsibilities to the community. The fact that Waymo describes this as a learning process—where “unique situations are flagged by specialists, and engineers evaluate performance”—demonstrates that these vehicles are still experimental, not fully vetted transportation solutions.
Red Flags from the Road: Recent Recalls and Safety Concerns
Just weeks before announcing its San Antonio testing, Waymo faced a significant safety crisis. In May 2025, the company recalled 1,212 of its driverless cars over faulty software that could cause them to crash into chains, gates, and other roadway barriers. The National Highway Traffic Safety Administration (NHTSA) stated bluntly that “a vehicle that crashes into chains, gates or other gate-like roadway barriers increases the risk of injury.”
While Waymo reported no injuries from these software failures, the timing raises serious questions:
- Why proceed with expansion into new markets immediately after a major recall?
- Have all affected vehicles been properly updated before arriving in San Antonio?
- What other software glitches might emerge as these vehicles encounter San Antonio’s unique road conditions?
For San Antonio residents, this recall history is crucial context. It demonstrates that even the most advanced autonomous vehicle systems can fail in predictable, dangerous ways.
Learning from Austin: A Cautionary Tale for San Antonio
Our neighbors in Austin have served as a testing ground for Waymo and its predecessor, Google’s self-driving car project, since July 2015, and their experiences offer important warnings. Since Waymo launched service in Austin, there have been 39 incident reports involving its vehicles, according to recent reporting. These aren’t just statistics—they represent real safety concerns for real people.
One particularly troubling incident went viral on social media in April 2025, when a woman claimed a Waymo got stuck and left her and her friends stranded in Austin. In a TikTok video, Becky Levin Navarro shared her “zero stars” experience, saying the vehicle stopped in the middle of the road and wouldn’t let them out.
While Waymo disputed some details of this incident, claiming passengers can always exit by pulling the door handle twice, the psychological impact of feeling trapped in a driverless vehicle cannot be dismissed. This incident highlights several concerns:
The “Prisoner in Your Own Ride” Phenomenon
When autonomous vehicles malfunction, passengers may experience:
- Panic and distress from loss of control
- Confusion about how to safely exit the vehicle
- Fear of being stranded in dangerous locations
- Uncertainty about whom to contact for help
Documentation Challenges
Unlike traditional accidents where you can speak with the other driver, autonomous vehicle incidents create unique documentation hurdles:
- No human driver to provide information
- Difficulty identifying the responsible party
- Challenges in obtaining incident reports
- Corporate entities that may downplay or dispute passenger experiences
Community Voices Matter: San Antonio’s Right to Be Heard
After Waymo announced its testing fleet in San Antonio, locals shared their thoughts in a Reddit thread, with reactions ranging from skepticism to outright opposition. One user poignantly wrote, “Please no….I just want some trains man.”
This sentiment reflects a broader concern: Are autonomous vehicles being pushed on communities that would prefer investment in proven public transportation infrastructure? As personal injury attorneys, we’ve seen how corporate interests can override community preferences, often with dangerous consequences.
Your Voice in the Process
San Antonio residents have the right to:
- Attend city council meetings discussing autonomous vehicle regulations
- Submit public comments about safety concerns
- Request information about testing locations and schedules
- Demand transparency about incident reporting procedures
- Advocate for strict safety standards before commercial deployment
When “Testing” Becomes Dangerous: Protecting Yourself on San Antonio Streets
As Waymo’s vehicles begin appearing downtown and on I-10, I-35, and I-37, San Antonio drivers, pedestrians, and cyclists need practical strategies for protecting themselves:
Defensive Driving Around Autonomous Vehicles
- Maintain Extra Distance: Autonomous vehicles may brake or accelerate differently than human drivers expect
- Avoid Sudden Movements: These vehicles rely on predictive algorithms that may not anticipate erratic human behavior
- Use Clear Signals: Ensure your turn signals, brake lights, and hand signals are clearly visible
- Document Everything: If you witness unusual behavior from an autonomous vehicle, safely record it if possible
If You’re Involved in an Incident
- Prioritize Safety: Move to a safe location if possible
- Call 911: Report the incident immediately, specifying that an autonomous vehicle is involved
- Photograph Everything: Include the vehicle’s identifying markers, license plates, and any visible sensor equipment
- Gather Witness Information: Bystanders may have observed behavior you missed
- Preserve Electronic Evidence: Your smartphone’s location data and any dashcam footage become crucial
- Seek Medical Attention: Even minor collisions should be medically documented
- Contact Legal Counsel: Autonomous vehicle cases require specialized expertise
The Human Cost of Corporate Innovation
While Waymo touts potential safety improvements, we must remember that testing phases inherently involve risk. In earlier coverage of Waymo’s announcement, Professor Dessouky acknowledged that we should “expect some glitching in the beginning” and “unusual reactions from these vehicles.”
But who bears the cost of these “glitches”? Too often, it’s innocent San Antonio residents who become unwitting participants in corporate experiments. Consider:
- The parent whose child is struck by a vehicle making an “unusual reaction”
- The elderly driver confused by unpredictable autonomous vehicle behavior
- The cyclist who assumes the vehicle “sees” them but doesn’t
- The family traumatized by a malfunction that causes a serious accident
These aren’t acceptable casualties of progress—they’re preventable tragedies when proper safeguards aren’t in place.
Corporate Accountability in the Age of Algorithms
Waymo’s parent company, Alphabet (Google), has vast resources and sophisticated legal teams. When incidents occur, victims often face:
David vs. Goliath Dynamics
- Corporate lawyers who minimize liability
- Technical jargon used to confuse and deflect
- Pressure to accept quick, inadequate settlements
- Non-disclosure agreements that silence victims
- Blame-shifting to human drivers or “user error”
The “Black Box” Problem
Autonomous vehicles operate using proprietary algorithms that companies guard as trade secrets. This creates significant challenges:
- Difficulty proving fault without access to code
- Corporate claims of “proprietary information” blocking discovery
- Technical complexity that overwhelms traditional legal processes
- Need for expert witnesses to interpret data
Lessons from Other Cities: A Pattern of Problems
During the June 2025 Los Angeles protests, five Waymo vehicles were targeted and set on fire. While we don’t condone violence or property destruction, this extreme community reaction signals deep-seated concerns about autonomous vehicles being imposed on communities without adequate consultation or consent.
In San Francisco, a Waymo robotaxi killed a dog while in “autonomous mode” in May 2023. This tragic incident raises questions about how these vehicles detect and respond to animals, children, and other vulnerable road users who may behave unpredictably.
Your Rights When Corporate Testing Goes Wrong
As Waymo begins its San Antonio operations, residents should understand their legal rights:
Right to Full Compensation
If injured by an autonomous vehicle, you’re entitled to compensation for:
- Medical expenses (current and future)
- Lost wages and earning capacity
- Pain and suffering
- Property damage
- Psychological trauma from the incident
Right to Transparency
Despite corporate secrecy, legal discovery can compel disclosure of:
- Vehicle data logs
- Testing protocols
- Known safety issues
- Previous incidents
- Internal communications about risks
Right to Refuse Inadequate Settlements
Initial settlement offers from corporate insurers rarely reflect true damages. You have the right to:
- Consult with attorneys before accepting any offer
- Pursue litigation if necessary
- Join with other affected parties in collective action
- Seek punitive damages for egregious corporate negligence
The Regulatory Vacuum: Why San Antonio Needs Stronger Protections
While Waymo says it is committed to working with communities and public officials in the cities it enters, such commitments offer little protection without enforcement mechanisms. San Antonio needs:
Comprehensive Safety Regulations
- Mandatory incident reporting within 24 hours
- Public database of all autonomous vehicle incidents
- Regular safety audits by independent experts
- Clear liability frameworks before commercial deployment
- Insurance requirements that protect victims, not corporations
Community Oversight
- Citizen advisory board with real authority
- Regular public hearings on testing progress
- Transparent communication about testing locations and times
- Mechanism for residents to report concerns
- Sunset clauses requiring re-approval for continued testing
Moving Forward: Protecting San Antonio’s Future
As Waymo’s vehicles begin navigating our streets, San Antonio stands at a crossroads. We can either accept corporate testing programs that treat our citizens as guinea pigs, or we can demand meaningful protections and accountability.
The technology industry often operates on a “move fast and break things” philosophy. But when the “things” being broken are people’s lives, bodies, and sense of safety, we must insist on a different approach.
Take Action Today
If you encounter Waymo vehicles during their San Antonio testing:
- Stay Alert: These vehicles are still learning our roads and may behave unpredictably
- Document Concerns: Report dangerous behavior to city officials
- Know Your Rights: You’re not obligated to participate in corporate experiments
- Seek Help: If injured or traumatized, legal support is available
Remember, technological progress shouldn’t come at the expense of public safety. As one Reddit user noted, many San Antonians would prefer investment in proven public transit over experimental autonomous vehicles. Your voice matters in shaping our city’s transportation future.
Conclusion: Your Safety, Your Rights, Your City
Waymo’s “road trip” to San Antonio represents more than just new technology—it’s a test of our community’s willingness to prioritize citizen safety over corporate innovation. With 39 incidents already reported in Austin and a recent recall affecting over 1,200 vehicles, we have every reason to approach this technology with caution.
As personal injury attorneys, we’ve seen how quickly corporate negligence can shatter lives. We’ve also seen how prepared legal advocacy can hold even the largest companies accountable. As Waymo’s testing begins, we stand ready to protect San Antonio residents who find themselves harmed by this grand experiment.
The future of transportation may indeed be autonomous, but the path to that future shouldn’t be paved with preventable injuries to San Antonio families. Until these vehicles prove themselves truly safe—not just “safer than human drivers” in carefully controlled statistics—our community deserves robust protections and vigorous legal advocacy.
If you or a loved one has been affected by an autonomous vehicle incident in San Antonio, contact Ryan Orsatti Law immediately. We’re committed to ensuring that corporate innovation doesn’t come at the cost of community safety. Your story matters, and your rights deserve protection.