Shopping as a constant poker game
Danish data analytics company a2i touts fuel pricing as an ideal implementation of its learning algorithms. The company claims that PriceCast Fuel, its dynamic pricing product, can improve fuel retailers’ margins by around 5 per cent.
“With the use of Artificial Intelligence PriceCast Fuel detects behavioral patterns in Big Data (all available data relevant to the sale) and relates to customer and competitor reactions with a frequency and level of accuracy that users of traditional pricing systems only can dream about,” the company explains in a brochure [PDF]. “Dynamically mapping customer and competitor behavior in order to identify the optimal route (price setting) throughout the day, makes it possible to relate to any given change in the local situation for a given station and re-route accordingly when necessary and within seconds.”
PriceCast can do traditional rule-based pricing where, for example, the supplier wants to set the lowest price possible. But a2i can also incorporate “Restricted AI” or “Advanced AI” pricing.
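a2i doesn't publish PriceCast's internals, so as a purely illustrative sketch, here is what the distinction might look like: a classic rule (undercut the cheapest local rival, subject to a floor) versus a crude stand-in for the "make margin on people who don't care" approach, where an estimated price sensitivity scales the markup. All names and numbers here are hypothetical.

```python
# Illustrative only -- not a2i's actual algorithm.

def rule_based_price(competitor_prices, undercut=0.01, floor=1.00):
    """Classic rule: match the cheapest rival, minus a small undercut,
    never dropping below a floor price."""
    return max(min(competitor_prices) - undercut, floor)

def sensitivity_adjusted_price(base_price, price_sensitivity, max_markup=0.05):
    """Hypothetical 'AI' tier: price_sensitivity in [0, 1], where 0 means
    indifferent customers and 1 means very price-aware ones. The less the
    customer cares, the more of the markup is applied."""
    markup = max_markup * (1.0 - price_sensitivity)
    return round(base_price * (1.0 + markup), 3)

base = rule_based_price([1.35, 1.32, 1.40])
print(base)                                   # cheapest rival minus undercut
print(sensitivity_adjusted_price(base, 0.1))  # indifferent driver: priced up
print(sensitivity_adjusted_price(base, 0.9))  # price-aware driver: near base
```

The second function is the whole controversy in four lines: the same litre of fuel gets a different price depending on how closely the seller thinks you are watching.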
The company says it isn’t ripping off anyone.
“This is not a matter of stealing more money from your customer. It’s about making margin on people who don’t care, and giving away margin to people who do care,” CEO Ulrik Blichfeldt told The Wall Street Journal earlier this month [paywalled].
a2i claims several fuel suppliers in Europe have signed on, but only one is prepared to go public. Why the shyness? Well, that isn’t too hard to work out.
Uber’s occasionally notorious “surge pricing” has established the principle that prices fluctuate in real time, but there is a logic to it. It’s designed to increase the supply of taxis on the roads from Uber’s casual labour pool. Uber can’t compel its labour to work, but since higher fares mean the drivers keep more money, they’re more inclined to go on the road. But dynamic pricing doesn’t produce more or cheaper petrol – it’s simply a case of retailers increasing margin where they can, as Blichfeldt has admitted.
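The supply-side logic described above can be caricatured in a few lines. This is a hedged illustration of the principle, not Uber's actual formula: when demand outstrips the pool of available drivers, the multiplier rises (up to a cap), and the higher fare is what coaxes more casual drivers onto the road.

```python
# A caricature of surge pricing -- not Uber's real algorithm.

def surge_multiplier(requests, available_drivers, cap=3.0):
    """More ride requests per available driver -> higher fare multiplier.
    The higher fare is the signal that draws more drivers out."""
    if available_drivers == 0:
        return cap
    ratio = requests / available_drivers
    return min(max(1.0, ratio), cap)

print(surge_multiplier(10, 10))  # balanced supply: no surge
print(surge_multiplier(25, 10))  # demand spike: multiplier rises
print(surge_multiplier(50, 10))  # extreme spike: capped
```

The point of the contrast: a petrol station's dynamic price has no such supply-side feedback loop. There is no pool of idle petrol waiting to be coaxed onto the forecourt by a higher price.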
Real-time dynamic pricing has long been a Silicon Valley fantasy. Four years ago, Affirm co-founder and CEO Max Levchin imagined personalised fuel pricing – based on your car broadcasting that your fuel gauge is low – and joked that we'll eventually see dynamically priced priests and therapists. But it would be a bold and foolish therapist who hiked their prices after a major natural disaster or terrorist incident.
AI-powered dynamic pricing sticks in the craw because it makes it blatantly obvious that the consumer is being gamed.
The “smart” consumer will shop around, but dynamic pricing turns shopping into a 24/7 poker game – a full-time hobby, or neurosis. To the VC and AI nerds of Silicon Valley this is how it should be, squeezing every last ounce of “inefficiency” out of a marketplace.
“This is the nightmare world of Big Data, where the moment-by-moment behavior of human beings – analog resources – is tracked by sensors and engineered by central authorities to create optimal statistical outcomes,” commented Nick Carr.
But that’s a relationship many of us would rather not enter. Not for nothing have retailers, for decades, promised “lowest prices around – or your money back”. It’s nice and simple. And it offers a kind of contract: we can hold them to account if they fail.
This is where AI offers another option to cynical retailers. How do we know if we’ve been offered the lowest price? And what happens when the pricing bots gang up on us?
“A cabal of AI-enhanced price-bots might plausibly hatch a method of colluding that even their handlers could not understand, let alone be held fully responsible for,” notes The Economist.
Dynamic pricing is an ugly idea with a couple of fairly ugly parents. One is “behavioural science”, or Nudging, which supplants an honest relationship with the customer with a tricksy and dishonest one. Nudging supposes we don’t know we’re being gamed, and is beloved of today’s policy makers.
Another is Silicon Valley’s insistence that because something can be done, we’ll just love them for doing it. Six months ago I suggested there were two major obstacles to ML adoption, even if the techniques worked as well in new use cases as the AI evangelists hoped. One was the liability issue – we need someone, somewhere to carry the can when things go wrong – while the other was consumer resistance to “helpful” suggestions, which are often creepy and unnecessary.
AI-powered dynamic pricing shows us a Silicon Valley culture determined to ignore both. Full steam ahead! ®