Surge Pricing, Artificial Intelligence, and Responsibility

On my first work trip to Jakarta for Grab, on 14 January 2016, multiple terrorist bombs exploded a couple of miles from the GrabBike office where I had just arrived. People were fleeing cafes and restaurants around the attack site. My new colleagues were shaken, glad to be safe, looking to help. There was news of crowds on the streets trying to get away, confirmed by a spike in booking requests from the blocks around the explosion. My colleagues remembered the 2002 Bali bombings, and knew we should get people to spread out. And we knew that our algorithms would treat this like rush-hour demand and activate surge pricing. People needed to be evacuated, but left to its own devices, our AI would have discouraged them with a higher price.

We should make trips away from the surrounding blocks free. The whole staff (including Anthony Tan, the CEO) was clear on that goal. I don’t know how many app notifications, vehicle type configs, price settings, promo code workarounds, driver incentives, and press communications people set up — but they figured it all out, leaping into a sudden disaster-response situation in the space of a few minutes. And it worked.

In Texas right now, a week into a freezing natural disaster, we’re hearing of customers being charged 75 times the normal rate for electricity (the average is about 12c per kWh; the peak was set at $9 per kWh). At that price, the average US household would pay around $8,000 a month for electricity. Instead of simply canceling these bills, there are various proposals for the state or federal government to provide “disaster relief” by paying them. Or maybe the next step is a class action lawsuit.
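The arithmetic behind those figures is easy to check. This is a back-of-envelope sketch, assuming a typical US household uses roughly 900 kWh per month (the EIA's estimate is in that range); the variable names are mine, not from any official calculation:

```python
# Back-of-envelope check of the Texas electricity numbers.
# Assumption: ~900 kWh/month for an average US household.
normal_rate = 0.12   # dollars per kWh, the ~12c average rate
peak_rate = 9.00     # dollars per kWh, the reported price cap hit during the freeze
monthly_kwh = 900    # assumed average monthly household usage

multiple = peak_rate / normal_rate        # how many times the normal rate
monthly_bill = peak_rate * monthly_kwh    # a month at the peak rate

print(f"Surge multiple: {multiple:.0f}x")            # 75x
print(f"Monthly bill at peak: ${monthly_bill:,.0f}")  # $8,100
```

A month at the peak rate lands at about $8,100, which is where the "around $8,000 a month" figure comes from.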

This is not just an absurd failure of government in its lack of planning, regulation, and incentive setting. It’s a cowardly attempt by the private sector to profit from its own ineptitude. If not by malice, then by unaccountability.

I learned more about surge algorithms at Grab, partly because pricing and discounts were closely related to my work launching GrabShare. As data scientists and engineers, we had a duty to help make pricing equitable for passengers and drivers. Topics included nonlinear surge, geographic smoothing, time-of-day buckets, price elasticity, featurization, and machine learning. None of these are silver bullets — they are tools in a system that also includes many human-oversight safeguards. These safeguards include basics, like hard limits on surge price multiples — these vary from city to city, but to give you a sense, we’re talking about numbers like 2 and 3, not 75 or 100. We have clear rules about price stability and guarantees — if we quote a $10 ride, the passenger gets a $10 ride. If that price in the meantime becomes inequitable for the driver because traffic and demand are up and the surge is larger, then the system messed up and we need to fix it. We don’t pass the buck (or the bill) on to the passenger in the meantime.
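Those two safeguards can be sketched in a few lines. This is an illustrative toy, not Grab's actual code; the cap value, function names, and numbers are all hypothetical:

```python
# Sketch of two safeguards: a hard cap on the surge multiplier, and
# honoring the quoted price even if surge rises before pickup.
# All names and numbers are hypothetical, not a real production system.

MAX_SURGE = 2.0  # hard city-level cap; real caps vary by city

def capped_surge(raw_multiplier: float) -> float:
    """Clamp whatever the demand model outputs to the range [1.0, MAX_SURGE]."""
    return min(max(raw_multiplier, 1.0), MAX_SURGE)

def fare(base_fare: float, raw_multiplier: float) -> float:
    """Price a ride from a base fare and the model's raw surge output."""
    return base_fare * capped_surge(raw_multiplier)

def charge(quoted_fare: float, fare_at_pickup: float) -> float:
    """Price-stability guarantee: the passenger always pays the quoted fare.

    fare_at_pickup is deliberately ignored; if surge rose in the meantime,
    that gap is the system's problem to fix, not the passenger's.
    """
    return quoted_fare

print(fare(10.0, 5.7))                 # model wanted 5.7x; passenger sees 20.0
print(charge(10.0, fare(10.0, 3.0)))   # quoted 10.0, so the charge is 10.0
```

The point of the cap is that no matter how extreme the demand model's raw output, the passenger-facing multiplier stays in the 2–3 range, not 75 or 100.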

Equitable pricing isn’t just applying a known set of rules, it’s a practice of thoughtful and watchful diligence. This is increasingly crucial in AI, where it is everyone’s responsibility to anticipate and prevent unintended consequences. A typical example — of course we had the idea of using destination as a feature for predicting an acceptable surge price. This came up as a suggestion so many times that a hypothetical “hospital example” became a repeated reminder — we would never build a system that might “learn” that people whose destination is a hospital tend to be willing to pay higher fares. The claim that “nobody really understands how AI systems make individual decisions” would be a lie to hide behind: when we properly stop and think before adding a new feature to a machine learning algorithm, we can normally come up with hypotheses to test quite easily, including bad outcomes that must be prevented.
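One way to make such a hypothesis concrete is to write it down as a guardrail test before the feature ships. The sketch below is entirely hypothetical — the model here is a trivial stand-in, and in a real pipeline the test would load the actual candidate model — but it shows the shape of the check implied by the hospital example:

```python
# Hypothetical guardrail test for the "hospital example": two trips that are
# identical except for destination category must not be priced differently
# to the hospital-bound passenger's disadvantage.
# The model below is a trivial stand-in, not a real trained pricing model.

def predicted_surge(features: dict) -> float:
    """Stand-in for a trained surge model under test."""
    # This toy model prices only on demand/supply, never on destination.
    return 1.0 + 0.3 * features["demand_supply_ratio"]

def test_hospital_destination_does_not_raise_price():
    trip = {"demand_supply_ratio": 1.5}
    to_hospital = {**trip, "destination_category": "hospital"}
    to_mall = {**trip, "destination_category": "mall"}
    # The bad outcome we are preventing: a higher surge for hospital trips.
    assert predicted_surge(to_hospital) <= predicted_surge(to_mall)

test_hospital_destination_does_not_raise_price()
print("ok")
```

A test like this turns "we would never build that" from a slogan into something a continuous-integration run can enforce every time the model is retrained.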

Those are just some of the ethical considerations within AI and machine learning and only one part of an organization responding to a crisis. During Grab’s Jakarta bombing response, the automated pricing surge algorithm was just switched off — we knew that part would do harm. For those few hours, business-as-usual was no more, and it was mainly the finance, promotion, and driver operations leaders scrambling to improvise together. The office went from confusion to celebration: high-fives as we heard the first reports of passengers getting away to safety; press reports that our rival Go-Jek had joined us in the effort; that day we were all working together and it was working. And of course, the commitment this inspired in me about Grab was incalculable.

Contrast this with what we have heard about electricity providers in Texas. The surge was allowed to skyrocket nearly 100-fold, just when people needed power most to survive. Some may argue “that’s how market forces work”, as if that makes the outcome right rather than the design wrong. Ambulance drivers are often in situations where people need them desperately — but “market forces” don’t make it ethical to charge injured people 100 times the going rate for an ambulance ride. There may be some who argue that an auction, rather than medical risk, should be used to prioritize people for a scarce vaccine. Most of us disagree.

So what we’re left with instead is the excuse of organizational helplessness. “Nobody wanted it this way, it just happened.” That’s rubbish. Someone has a password to a database and could go in and cancel those outrageous bills right now. There are heads of department who could demand such action and answer to their CEO. There are CEOs of energy companies who could have ordered on day one of the disaster “We provide an essential service that people need to survive: bills will be cancelled and payment figured out later when it’s safe”. They haven’t done this yet. Instead we’ve heard politicians say they’re trying to figure something out — even including using disaster relief money to line the pockets of these companies.

Wake up, American businesses. This isn’t about “preventing socialism” or “the sanctity of market forces”. Don’t treat your customers as hostages. That’s evil, and companies that do this will ultimately fail. Businesses are not just lowly enforcers of runaway algorithms, so don’t pretend that we are. If anyone reading this can implement proper remediation steps for any of the power companies involved: please seize this opportunity. You can be a private-sector hero today.

Works at LivePerson on AI and software engineering, particularly natural language processing. See http://puttypeg.net for more.