Optimization In Business: People Over Algorithms
Chris Knerr is a CxO, Senior Business Strategist, Trusted Advisor, Fortune 50 Executive, Entrepreneur and Change Agent.
We often have good instincts about whom to talk to — strangers, I mean. It’s unclear why we feel drawn to some people and disinclined to interact with others. For the sake of argument, let’s agree it’s intuition and that our intuition, at least some of the time, is valid and valuable. And if we’re wrong, in most ordinary business or social situations, the likely worst case is we’ll shrug it off and wonder why we wanted to talk to that person.
I was recently staying in a moderately trendy, mid-sized hotel in Boston’s Seaport neighborhood. While waiting for the elevator, I struck up a conversation with a man. He was also by himself, and I inferred he was there on business, too.
If you haven’t spent time in larger buildings with multiple elevator banks, you may not realize that newer elevator systems have built-in scheduling algorithms to minimize wait times. Rather than pressing “up” or “down,” you enter a specific floor. This provides what my friends in the supply chain call a “demand signal.” The algorithms aggregate the signals to determine an efficient schedule.
Rather than just waiting where the last passenger disembarked or rotely returning to the ground floor, each elevator car hovers near where it’s expected next based on historical patterns or predetermined rules. In the morning rush hour, the cars will hover near the ground floor. They’ll group together passengers going to higher floors so that other cars can jet up to the low or mid floors with the appropriate passengers and get back to the ground floor quickly to serve peak demand.
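The batching logic described above can be sketched in a few lines. This is a deliberately simplified, hypothetical model — not any elevator vendor's actual algorithm — assuming passengers enter destination floors at a lobby kiosk and the dispatcher groups nearby floors into the same car:

```python
# Minimal sketch of destination-dispatch grouping (hypothetical model,
# not a real vendor's scheduler). Passengers key in their destination
# floor; the dispatcher batches nearby destinations into one car so each
# car makes fewer stops and returns to the lobby sooner.

from itertools import groupby

def dispatch(requests, car_capacity=6, band_size=5):
    """Group destination-floor requests into per-car passenger loads.

    requests: destination floors entered at the lobby kiosk.
    band_size: floors are batched into contiguous bands (e.g. 1-5,
               6-10) so one car serves a narrow range of the building.
    """
    assignments = []
    # Sort so passengers heading to nearby floors end up adjacent.
    ordered = sorted(requests)
    # Batch by floor band, then split each band into capacity-sized loads.
    for _, band in groupby(ordered, key=lambda f: (f - 1) // band_size):
        floors = list(band)
        for i in range(0, len(floors), car_capacity):
            assignments.append(floors[i:i + car_capacity])
    return assignments

# Morning rush: a burst of lobby requests for low and high floors.
print(dispatch([12, 3, 14, 2, 15, 4, 11, 5, 13]))
# → [[2, 3, 4, 5], [11, 12, 13, 14, 15]]
```

One car takes everyone bound for floors 2–5 and another takes the 11–15 group — exactly the grouping that sent my almost-friend and me into separate cars.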
It’s an elegant and logical system. And since most people don’t enjoy waiting around for elevators, it certainly appears to maximize utility, as my economist friends say. My new Boston almost-friend and I were staying on different floors, so two separate elevator cars came for us. That was the end of our conversation. Our schedule was optimized. I was optimized.
And yet, if we’d had a chance to chat for another 30 seconds, what would’ve happened? Would one of us have made an elevator speech? Would we have ended up getting a drink in the bar? Perhaps he would’ve invested in my company or signed a six-figure software deal. All these possibilities were annihilated, of course, in order to save us a few seconds of elevator transit time.
Behavioral economics research on choice architecture has proven valuable for designing defaults, and the way options are presented, to drive desirable outcomes. Famously, for example, to promote personal savings, most 401(k) plans now enroll employees by default at a prescribed percentage of each paycheck; opting out requires a conscious choice. This has the good outcome of improving savings rates.
What bothers me about the elevator episode is the design of a default choice architecture that’s myopically focused on mechanical efficiency at the expense of all other desirable outcomes, like human interaction. What’s further striking is that we experience this kind of optimization more or less 100% of the time on the web and social media. We’re constantly steered — optimized — toward some outcomes and away from others, based on rules defined by and for someone else.
And these so-called optimizations are certainly not always, or even often, in our best interest. In fact, they’re generally designed to hold our attention on something, steer it away from something or better monetize us. For the most part, we don’t notice these virtual systems of algorithmic control. I noticed this one because I was physically steered away from, and separated from, my almost-friend.
It’s worth noticing when we’re being optimized: whether we like the outcomes the invisible hands of algorithms are steering us toward, and which choices have been truncated for some supposed efficiency or the tacit monetization of our attention. The rules and outcomes of optimization are created by people. They can and should be humanized and created for people as well.
As algorithmic thinking and systems become even more pervasive, designing transparency and choice into them is paramount in business. Active discussions about what you’re trying to optimize and whether it’s actually what you want are key. Designing systems and interfaces where “suggestions” can be turned off is a great start. For example, I don’t appreciate it when my streaming services demand “watch something” or “play more like this,” and I can’t disable it. I don’t ever want my next choice to be determined by an algorithm without a chance for me to weigh in. It’s okay if some people want this, but when that so-called choice is invisible and can’t be changed, that’s a process design pattern companies should avoid.
This impulse toward transparency and choice needs to span both operational process and systems design and C-level vision and planning. Business people and entrepreneurs are steeped in the culture of optimization: How do we improve gross margins, reduce cycle times, retain our best people and so on? It’s incumbent on us to focus on new growth and innovation, not just on optimizing the baseline. So, when doing business planning, ask about top-line growth and not just cost containment. When working on employee retention, make sure your systems leave space for a real human conversation and genuine understanding, not just a score based on a rubric.
Business is about new growth and innovation. While these can be supported by optimization, they arise from our own human creativity, and as leaders, we need to design processes and organizations that create space and oxygen to keep us — not the algorithms — in the driver’s seat.
Forbes Business Council is the foremost growth and networking organization for business owners and leaders.