When AI Becomes a Salesperson: Who Holds the Liability?

Imagine you run a mid-sized health and beauty retailer. To drive growth, you integrate ChatGPT’s new shopping feature into your website so customers can get instant product recommendations. 

One customer types in: “What’s the best natural cream for eczema?” The chatbot confidently recommends a product stocked in your online store, describing it as safe and effective. The purchase is made. 

Days later, the customer suffers a severe reaction. They lodge a complaint and seek damages. The manufacturer insists the product was properly labelled. The customer argues they bought it because the AI framed it as safe. The AI provider claims it only generates responses and doesn’t sell products. 

Now your business is caught in the middle. Is the liability yours, the manufacturer’s, or the AI provider’s? Your brand is on the line, your legal team is scrambling, and your insurer is asking which policy (if any) should respond. 

What began as a growth initiative has become a governance challenge. And the lesson is clear: when AI takes the role of a salesperson, liability no longer sits in neat categories. 

The New AI Sales Channel

Integrating AI into commerce does more than speed up the transaction. It reshapes the role your business plays in the value chain. You are no longer just stocking and selling products; you are hosting a digital intermediary that actively shapes what customers choose. 

Take the eczema cream example. The chatbot didn’t simply display products; it described one as safe and effective. In the customer’s eyes, that sounded like professional advice. 

So where does that leave the business? 

  • If the AI is seen as giving advice, does the retailer carry a duty of care? 
  • If the AI elevates one option over another, is that marketing, endorsement, or sales? 
  • If the product harms a customer, will courts treat the business as a retailer, an adviser, a platform provider, or all three? 

Each classification comes with different regulatory obligations, contractual exposures, and insurance triggers. And today, there is no settled definition.

Data as an Exposure Multiplier

In the eczema cream case, the chatbot wasn’t guessing. It based its recommendation on data such as customer profiles, browsing patterns, and even health-related queries. That personalisation makes AI shopping powerful, but also risky. 

The more data an AI touches, the bigger the exposure if something goes wrong. A breach of the platform could compromise not only names and payment details, but sensitive health information. 

From an insurance lens, this changes the conversation. The exposure isn’t just whether the product harms a customer; it’s whether the act of recommending it creates cyber, privacy, and reputational risks that stack on top of product liability. 

Algorithmic Negligence

Negligence law is built around human actors. A pharmacist who gives unsafe advice can be held accountable. But what if the advice comes from a chatbot? 

In the skincare example, the AI described a cream as “safe and effective.” If a customer relied on that statement and was harmed, negligence could be argued. But against whom? 

  • The retailer, for integrating the AI into its platform? 
  • The manufacturer, for supplying the product? 
  • The AI provider, for generating the language? 

This is the emerging gap between human duty of care and algorithmic outputs. Until courts test it, businesses risk exposure to claims they never anticipated and insurers face uncertainty over which policies should respond. 

Regulation Playing Catch-Up

Australia’s consumer law covers misleading conduct, product safety, and disclosure obligations. But none of it was written with algorithms in mind. 

In the eczema cream case, the AI made a recommendation that sounded like professional advice, yet it appeared without disclaimers, context, or regulatory oversight. Does this count as misleading conduct? Marketing? Advice? 

Until regulators provide clarity, businesses are operating in a legal grey zone. A dispute could end up being tested under multiple liability frameworks at once: product liability, professional indemnity, and cyber. Each has different triggers and exclusions and none was designed for AI-driven sales. 

Knightcorp Point of View

AI in commerce is advancing faster than governance frameworks. That speed creates blind spots for businesses eager to adopt the technology without fully considering the exposures. 

Insurance tends to evolve after catastrophe, but real resilience comes from anticipating blind spots before they become claims. For boards and executives, that means asking sharper questions now: 

  • If AI mis-sells on our platform, who carries the liability? 
  • Do our existing policies respond, or do we face coverage gaps? 
  • What happens if customer data is compromised through AI-driven sales? 
  • Are we treating AI commerce as a compliance issue, or as a governance-level risk? 

Closing Reflection

AI shopping is not just a feature. It is a liability frontier where accountability is unclear, sensitive data is exposed, and regulation is still catching up. 

For business leaders, the takeaway is simple: growth opportunities always bring new exposures. The companies that succeed will not be those that rush to adopt without foresight; they will be the ones that build resilience into their strategies, test their protections, and prepare for the disputes that will inevitably follow. 

Because when AI becomes the salesperson, liability doesn’t just shift; it multiplies. 







DISCLAIMER: This information is provided to assist you in understanding the risks, implications, and common considerations for your industry. It does not constitute advice and is not complete. Please contact Knightcorp Insurance Brokers for further information. 

DISCLAIMER: Third-Party Links: This article may contain links to external websites or content operated by third parties. These links are provided for your convenience and information only. Knightcorp Insurance Brokers does not endorse, approve, or accept responsibility for the content, accuracy, or security of any externally linked site. 

Please note that any third-party websites are not covered by Knightcorp Insurance Brokers’ policies or procedures. We make no representations about the suitability of any information, product, or service contained on those sites for your needs. Use of third-party links is at your own risk, and we recommend reviewing their terms, conditions, and privacy policies. 

For advice specific to your insurance needs, please contact your Knightcorp Insurance Brokers directly.