Your Customers Will Blame You When Their Shopping Bots Go Rogue

By Steve Cocheo, Senior Executive Editor at The Financial Brand

Published on February 9th, 2026 in Payments


Agentic commerce, along with agentic payments, is one of the hottest areas in AI these days. The technology, as used today and as envisioned, ranges from GenAI digital shopping assistants to more sophisticated applications that turn AI agents loose to shop under human instruction and even complete transactions autonomously.

Reality check: How smoothly will the transition to agentic commerce go for merchants, consumers and banks?

A white paper from the Consumer Bankers Association and the Davis Wright Tremaine LLP law firm suggests that all three could face serious risks if agentic commerce and, especially, agentic payments, proceed without safeguards that have come to be taken for granted in traditional payment channels.

Who pays when things go wrong? “The general rule in the Electronic Fund Transfer Act that limits consumers’ liability for unauthorized transactions may not apply when agents are involved. … Consumers may be liable for mistakes their agents make and these mistakes could be costly.”

The key risk for banks: Agentic commerce disputes will wind up in their laps.

“Banks cannot be slow in understanding how their customers will be using AI and agentic payments tools because banks should expect customers to reach out for help when agentic transactions go wrong,” according to the white paper. “Today, banks are the primary touch point for customers when there is a dispute or issue involving a payment they have made.”

The bottom line: “Customers expect banks to make them whole for failed or improper payment transactions.”

Need to Know:

  • Consumer interest in agentic commerce is building. In the 2025 holiday season, shopping using GenAI-powered chat services and browsers rose nearly 700% over 2024 levels, according to Adobe.
  • “Consumers are ready to delegate meaningful purchasing and financial tasks to agentic AI. However, broad adoption will be constrained and shaped by ‘payments-grade’ trust,” says a January report by PYMNTS.com.
  • The same report says that consumers aren’t seeking novelty but see agentic shopping as key to reducing friction.
  • Banks and other payments providers can’t ignore the risks, as technologies like OpenClaw — “the AI that does things” — threaten to bypass controls that currently exist.

Understanding the Agentic Commerce Landscape

Major payments players including Mastercard, Visa, PayPal and Google are pushing agentic commerce and payments, with announcements coming steadily from these and other players.

The new CBA white paper, “Agentic AI Payments: Navigating Consumer Protection, Innovation, and Regulatory Frameworks,” distills discussions at an association symposium last fall. Attendees included banks, tech firms, merchants, payment networks, policymakers and consumer advocates.

The paper defines agentic commerce narrowly: tools that orchestrate transactions without direct human involvement and execute them autonomously.

How it works: True agentic AI would be given a command like this: “Find and buy the best laptop under $500 for a 13-year-old.”
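The guardrails the white paper argues are missing can be illustrated with a minimal, hypothetical sketch. Nothing here reflects any real payment network’s API; the `Mandate` structure, the merchant allowlist and the price cap are all invented for illustration of how a consumer’s instruction might be bounded before an agent is allowed to pay:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Mandate:
    """A consumer's instruction to a shopping agent, with hard constraints.

    Hypothetical structure for illustration only -- not any real
    payment network's or agent platform's schema."""
    query: str                          # what to buy
    max_price: float                    # hard spending cap, in dollars
    allowed_merchants: tuple[str, ...]  # merchants the agent may transact with

def within_mandate(mandate: Mandate, merchant: str, price: float) -> bool:
    """Return True only if a proposed purchase stays inside the mandate."""
    return price <= mandate.max_price and merchant in mandate.allowed_merchants

# A purchase exceeding the cap, or routed to an unapproved merchant, is refused.
m = Mandate("best laptop for a 13-year-old", 500.00,
            ("TrustedStoreA", "TrustedStoreB"))
```

The point of the sketch is that the check runs at transaction time, outside the agent itself, so a confused or compromised agent cannot spend beyond what the consumer authorized.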

Widespread adoption of this kind of digital commerce depends in part on consumers’ faith that they would remain protected by longstanding rules applicable to credit and debit cards. These rules are grounded both in federal law and in the policies of traditional payment networks. None of those protections were created with agentic commerce and payments in mind.

The critical issue: The potential new rails for agentic payments, such as crypto, lack ready mechanisms for refunds, chargebacks and other needs that are an essential part of consumers’ use of digital channels. In fact, the paper suggests that the development of new payment rails — including consumer use of stablecoins — may actually be hastened by agentic payments.

Consumers, merchants and banks will find themselves in uncharted — and unregulated — territory, according to the paper.

“Immediate government action appears unlikely in the short term. The Trump administration has given clear signals that policymakers should allow AI and agentic tools to flourish.” Thus far, the report adds, Trump regulators have been hands-off and “have allowed consumers to test these products before imposing significant regulations on them.”

Read more: How the Marriage of Open Banking and Payments Will Change Everything


Mapping the Risks to Consumers

The white paper digs into risks that agentic commerce could pose to consumers:

Agents fail to act in consumers’ best interest. This risk can take multiple forms. One is an agent favoring a particular merchant’s products over others, or preferring one set of payment rails over another, potentially because it means more revenue for the agent’s developer — even when other providers’ products may be a better deal.

Agents make mistakes with consumers’ money and credit. Combinations of bad instructions and insufficient training data could result in bad purchase decisions. Symposium attendees noted that current agent technology lacks the ability to know when it is missing critical context. Example: Buying frozen food at a volume discount for someone with a tiny freezer.

The “Tickle Me Elmo” Trap. One risk hypothesized by the paper is that large numbers of agents could zero in on the same product, pricing or merchant. This could reverberate in availability or pricing — and potentially flood merchants’ systems.

Agents create opportunities for breaches of consumer data. To function as envisioned, powerful agents will require access to many types of consumer data, including payment and credit history, account balances, health and insurance information, past purchase and browsing records, and more.

A related risk. Laws such as the Gramm-Leach-Bliley Act and state privacy laws provide some data protection — “but many agentic payment applications may be created by non-financial service companies not covered by these regimes.”

Agents that are malicious. “Criminals could create fake agents that purport to be legitimate agentic payment tools but that actually steal financial information or execute unauthorized transactions.”

Agents could overreach. Consumer liability for unauthorized electronic fund transfers is typically capped. However, the paper points out a key exception: Consumers are not covered by the caps “if the consumer gave their access device to another person and that person exceeded the scope of their authority.”

When is an AI agent a “person”? That’s hazy right now.

Read more: Are American Consumers Ready to Let AI Agents Shop and Pay on Their Behalf?

The Key Agentic Commerce Risks for Banks

Many of the issues affecting consumers, as well as various liabilities impacting merchants, would have ripple effects on banks. The paper suggests that agents exceeding their instructions or otherwise generating disputed transactions — even where the liability exception described above does not apply — could increase the number of disputes banks must handle and, potentially, banks’ need to reimburse customers.

“Agents make purchasing decisions without human review, leading to more transactions susceptible to merchant fraud or simply merchants unable to fulfill large influxes of unexpected orders,” according to the white paper.

There’s also a risk that agents will default to using banks’ dispute and chargeback systems in lieu of using merchants’ customer service functions.

The white paper highlighted additional risks that agentic commerce could pose:

Compliance-related exposure. For example, an agent designed to consider creditworthiness could be influenced by training data containing past discrimination or redlining. That could cause an agent to steer some consumers away from credit options in violation of fair-lending laws.

Agentic commerce may also pose added fraud risk to banks. “If agents are developed by third parties and operate through merchant APIs or open banking infrastructure, banks may have limited visibility into agent behavior.”

Read more: The Identity Dilemma: How AI Blurs the Line Between Reality and Fraud


So Far, Any Solutions Are at Best Works in Progress

The white paper discussed a variety of potential solutions to the risks, but many aren’t likely to go anywhere in the current political and regulatory environment. This includes the idea of making the Federal Trade Commission the federal regulator of AI agents. In addition, according to the paper, some proposed solutions lack the speed necessary to address risks posed by high-speed agentic purchasing.

Some solutions that might play out in time:

An agentic AI version of RESPA. The Real Estate Settlement Procedures Act bars fees, kickbacks and the like for referrals for real estate settlement services involving federally related mortgage loans. This could be a model for broader consumer commerce.

State licensing of AI providers. The paper compares this to state-level licensing of money transmitters and lenders. It notes that some symposium attendees saw this as too burdensome on agentic AI startups.

Industry self-regulation, to set standards for agent operations.

Read next: Is AI Learning the Job Faster Than Banks Can Redefine It?

About the Author

Steve Cocheo is the Senior Executive Editor at The Financial Brand, with over 40 years in financial journalism, including the ABA Banking Journal and Banking Exchange. Connect with Steve on LinkedIn: linkedin.com/in/stevecocheo.
