Major corporations have spent so much on technology over the years that they tend to think their systems are a lot more sophisticated than they really are, argues Steve Sashihara, the CEO of Princeton Consultants, Inc. and the author of “The Optimization Edge: Reinventing Decision Making to Maximize All Your Company’s Assets” (McGraw-Hill, February 2011).
“The biggest mistake we’ve seen is the assumption a lot of company leaders have that they have made such massive investments in IT that they must have optimization all over the place,” Sashihara says. I interviewed Sashihara several months ago but only recently got around to finishing his book (I have my own business optimization issues, you see).
One of his most interesting arguments is that a great deal of the effort spent on information gathering and analysis is wasted — or, at least, used sub-optimally — when it’s used to feed business intelligence systems that produce reports that ultimately wind up being fed into spreadsheets and PowerPoint slides. Managers then sit around in a conference room listening to presentations and debating what the data means and what decisions should be made about it — when, in many cases, good software could make the decision itself. The GPS in your car is optimizing when it says “turn left at Main Street” rather than presenting you with a list of possible routes.
“What makes a piece of software optimizing is that it makes a recommendation,” Sashihara says.
That’s not to say an optimizing system can’t be wrong. Sometimes your GPS gets confused and sends you down the wrong street. If you don’t entirely trust a corporate optimization system to make a decision, you may instead want to have it present a ranked list of recommendations. That’s essentially what Google does when it searches the web and presents you with a list of the pages it considers the most likely match for your keywords. The IBM Watson computer system that won on Jeopardy earlier this year also used an internal system of ranking the best matches for a given question, although it ultimately made its own best guess at the answer (see Computerized Jeopardy Champ Shows IBM What Is Analytics?). Some of the best optimization software gives users a way of seeing how it arrived at its conclusion so they know how far to trust it, Sashihara says.
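The difference between a fully optimizing answer and a ranked list can be sketched in a few lines. The routes and travel times below are invented for illustration:

```python
# Hypothetical candidate routes with estimated travel times in minutes.
routes = {"Main St": 12.5, "Oak Ave": 13.1, "Highway 9": 14.0}

# A fully optimizing system would return only the top entry; a ranked
# list keeps the scores visible so users can judge how far to trust it.
ranked = sorted(routes.items(), key=lambda kv: kv[1])
for name, minutes in ranked:
    print(f"{name}: {minutes:.1f} min")
```

Showing the scores alongside the ranking is what lets a skeptical user see *why* the top recommendation came out on top.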
Some industries have made extensive use of optimization software for specific business problems, such as supply chain optimization, transportation management, and price optimization for retailers. But Sashihara argues businesses have barely begun to tap the potential of the technology to be applied to any problem of limited resources — whether those resources are people, fuel, space on a shelf or in a truck, hours in the day, or dollars in the bank.
Sashihara uses the term “optimization” as one he thinks resonates with business leaders, but the discipline is more formally known as operations research. His book traces the beginnings of the field to military planning for World War II and the application of algorithmic approaches for iterating over many possible uses of resources to find the optimal combination. With computerization, these mathematical analysis techniques came to be applied to ever larger data sets. The airline industry applied these principles to yield management, allocating seats and calculating prices to fill planes and drive profitability.
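The core idea of iterating over possible uses of resources can be shown with a toy allocation problem. Everything here — the vans, the regions, the revenue figures — is invented for illustration:

```python
# Toy example (invented figures): allocate 10 delivery vans across three
# regions to maximize estimated daily revenue. revenue[r][k] is the payoff
# from assigning k vans to region r, with diminishing returns.
revenue = {
    "north": [0, 50, 95, 135, 170, 200, 225, 245, 260, 270, 275],
    "south": [0, 40, 78, 114, 148, 180, 210, 238, 264, 288, 310],
    "west":  [0, 60, 110, 150, 180, 200, 215, 225, 230, 232, 233],
}
VANS = 10

def total(alloc):
    n, s, w = alloc
    return revenue["north"][n] + revenue["south"][s] + revenue["west"][w]

# Brute-force search over every feasible split of the 10 vans.
candidates = [(n, s, VANS - n - s)
              for n in range(VANS + 1)
              for s in range(VANS + 1 - n)]
best = max(candidates, key=total)
print(best, total(best))
```

Real problems have far too many combinations to enumerate this way, which is exactly where the algorithmic machinery of operations research — linear and integer programming rather than brute force — comes in.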
The current state of the airline industry is proof that optimization software is not a silver bullet, Sashihara acknowledges. But maybe the airlines ought to be looking for new things to optimize. One of Marriott’s recent optimization successes came when it took the yield management techniques it had applied over many years to the pricing of individual rooms and brought them to the pricing of group deals for weddings and conferences, which previously had not benefited from that kind of automation. This entailed working out the right formula for pricing group rates based on how far in advance the reservation was being made and what other revenue a hotel might be giving up by selling a large block of rooms and potentially locking out more profitable customers. Marriott attributed a $46 million boost in revenue between 2008 and 2009 to this change, according to Sashihara.
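The kind of formula Sashihara describes might look roughly like this sketch. Every rate, discount, and rule below is invented for illustration, not Marriott’s actual model:

```python
def group_rate_quote(lead_days,
                     transient_rate=220.0,  # typical nightly rate (hypothetical)
                     sell_through=0.60,     # share of the block we'd sell anyway
                     base_discount=0.25,    # standard group discount
                     early_bonus=0.10):     # deeper cut for far-out bookings
    """Quote a per-room-night group rate (all parameters are illustrative)."""
    # Revenue per room-night we expect to displace by committing the block
    # to the group instead of selling those rooms to individual travelers.
    displaced = transient_rate * sell_through
    # Bookings made far in advance earn a deeper discount: the hotel still
    # has plenty of time to resell its remaining, unblocked inventory.
    discount = base_discount + (early_bonus if lead_days > 180 else 0.0)
    quote = transient_rate * (1 - discount)
    # Never quote below the revenue the block would displace.
    return max(quote, displaced)

print(group_rate_quote(lead_days=30))   # near-term booking, smaller discount
print(group_rate_quote(lead_days=200))  # far-out booking, deeper discount
```

The point of the sketch is the two inputs the article names: lead time drives the discount, and displaced transient revenue sets the floor under the quote.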
One way operations research can go wrong is by drifting too far in the direction of research and failing to concentrate on delivering real results. The operations research group at UPS was nearly shut down at one point in the early 1990s because it had failed to deliver any significant bottom-line results in years. After achieving success with a series of skunkworks optimization projects, UPS had created a formal department and flooded it with money, encouraging the internal consultants to pursue “moonshot ideas.” Once the group refocused on delivering incremental improvements like changing the way boxes were packed and trucks were loaded, the effort was much more successful.
No, not all business decisions can or should be automated. “Most optimization is at its prime making lots and lots of repetitive tactical decisions — turn left here. If the question is, ‘Should we buy our biggest rival?’ that doesn’t lend itself to optimization,” Sashihara says.
Still, he believes many executives are so enamored of their decision-making prowess that they resist the thought of offloading even the routine decisions to a computer system — even where the decision can be reduced to a mathematical calculation where a piece of software can examine a much larger number of possibilities and make a decision free of human biases. Another thing holding back broader applicability of optimization to new business processes is that relatively few people have the required skills or the experience with optimization software engines like COIN-OR, an open source product, or IBM’s ILOG CPLEX Optimizer.
At the same time, one long-standing constraint, access to computing power, has largely gone away. It used to be that only organizations with a mainframe or a supercomputer could tackle ambitious optimization problems; cloud computing has removed that obstacle. You can rent a thousand servers from Amazon Web Services or one of its competitors and have each of those servers chew on a piece of the puzzle and send back its results, using something like the Hadoop open source system for distributed data analysis.
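A minimal illustration of that fan-out pattern, with local threads standing in for rented cloud servers and a made-up scoring function standing in for the real analysis:

```python
from concurrent.futures import ThreadPoolExecutor

def score(x):
    # Stand-in objective function; a real job would plug in its own model.
    return -(x - 42) ** 2

def evaluate_slice(candidates):
    # Each "server" searches its own slice of the candidate space ("map").
    return max(candidates, key=score)

def distributed_best(candidates, workers=4):
    chunk = max(1, len(candidates) // workers)
    slices = [candidates[i:i + chunk]
              for i in range(0, len(candidates), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partial = list(pool.map(evaluate_slice, slices))
    # The coordinator combines the partial answers ("reduce").
    return max(partial, key=score)

print(distributed_best(list(range(1000))))
```

On a real cluster the slices would travel over the network to worker machines, with a framework like Hadoop handling the distribution, scheduling, and fault tolerance.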
This style of data analysis without a relational database is becoming increasingly important, particularly for real-time analysis, Sashihara says. “By the time you’ve figured out how to load the data into Oracle, the moment is probably gone.”
If you have been asking yourself what to do with the cloud, here is your answer, Sashihara says. “What’s a good app for it? Optimization is a good app. It’s tremendously processor intensive, so the challenge is to make the analysis run in a tractable amount of time.” But given enough processors to work with, it’s not such a challenge anymore.
Article originally published by Forbes
Cubility is the trusted advisor to some of Australia’s largest oil and gas, mining, utilities and public companies. We help ensure your company is operationally ready and business effective through modern technology strategies, program management and IT support.