I am sitting on a train to Düsseldorf on my way back from Paris, where I presented an update of what we are doing at arcplan to a mixed audience of customers and prospects. Part of my presentation included the usual company and product development updates. The outlook included a preview of our next release, code-named Xenon, in the context of what is happening in businesses these days. One of the key topics was the explosive appearance of mobile devices and the challenges this poses to organizations – different form factors and operating systems, security issues, and expectations from a user community educated by the private consumption of applications on these devices (bringing an expectation of usability to the business environment). Of course, I introduced our approach – a first in the business intelligence world – to solving the dilemma of catering to this ever-increasing diversity of device types and form factors: DORA, Develop Once, Run Anywhere. This is accomplished through responsive design for business intelligence and analytics applications. The audience was clearly impressed, as was our customer advisory board in a similar session last week.
However, this blog article is not about how to develop and deploy analytic content effectively in this new world; it’s about the business value BI solutions create.
This year we were positioned by Gartner in their annual Magic Quadrant for Business Intelligence Platforms. Although the Gartner analysts expressed strong appreciation for our capabilities (and commented accordingly in the strengths and cautions section of the report), we are positioned at the lower end of the Niche Players quadrant. We were told this is partially due to self-service analytics and data discovery playing a strong role in this year’s Quadrant, as these represent advanced BI. Really?
Everyone is throwing around the term “analytics” – about as much as they’re throwing around the term “big data.” While I might put big data on my list of the Most Overused Phrases, analytics gets a pass. As companies realize the amount of insight and value they can glean from their ever-growing volumes of data, there has been a surge in analytics initiatives. The goal of these projects is to use data to analyze trends, the effects of decisions, and the impact of scenarios to make improvements that will positively impact the company’s bottom line, improve processes, and help the business plan for the future.
In order for analytics to remain relevant and always provide value, organizations must continually up their game. One way to do this is with predictive analytics, which is becoming more mainstream every day. If you stick around to the end of the article, I’ll tell you a simple way to bypass its complexity and still get the predictions you need.
Gettin’ Predictive With It
Predictive analytics involves making predictions about the future or setting potential courses of action by analyzing past data. A 2012 benchmark study by Ventana Research revealed that predictive analytics is currently used to address a variety of business needs, including forecasting, marketing, customer service, product offers and even fraud detection. While predictive analytics used to be in play at only a small number of companies, two-thirds of the companies participating in Ventana’s survey are using it, and among those, two-thirds are satisfied or very satisfied. These results indicate how predictive analytics has matured over the last few years, as technology has advanced to make it less expensive and more approachable, and therefore easier for more areas of the business to use. At this point, it’s safe to say that most Fortune 500 companies are churning out predictive insights on a regular basis, but that doesn’t mean smaller companies without “big data” can’t do the same thing. They can supplement their internal data with external data from social media, government agencies, and other sources of public data to get the insights they need.
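In its simplest form, “making predictions by analyzing past data” can be as modest as fitting a trend line to a history of numbers and extrapolating one period ahead. The sketch below does exactly that with ordinary least squares – the monthly sales figures are purely hypothetical, and real predictive models are of course far richer than a straight line:

```python
# A minimal illustration of predictive analytics: fit a least-squares
# trend line to past data, then extrapolate one period into the future.
# The monthly sales figures below are hypothetical.

def fit_trend(values):
    """Ordinary least-squares fit of y = a + b*t for t = 0, 1, 2, ..."""
    n = len(values)
    t_mean = (n - 1) / 2
    y_mean = sum(values) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(values))
    den = sum((t - t_mean) ** 2 for t in range(n))
    b = num / den            # slope: average change per period
    a = y_mean - b * t_mean  # intercept
    return a, b

def forecast_next(values):
    """Predict the value one step beyond the observed series."""
    a, b = fit_trend(values)
    return a + b * len(values)

monthly_sales = [102, 108, 111, 119, 123, 130]  # hypothetical history
print(round(forecast_next(monthly_sales), 1))
```

The same pattern – learn parameters from history, apply them to the next period – underlies the far more sophisticated models used for fraud scoring or demand forecasting.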
Let’s take a look at finance institutions, which have predictive analytics down to a science….
I just got back from the Teradata PARTNERS Conference in Washington D.C. – once again a great event for learning from experts in the industry, listening to real-world examples of the challenges of managing and leveraging huge data volumes, and networking with our fellow Teradata partners and customers alike.
It was my second consecutive year at the event, and what struck me most this year was that the topics have clearly shifted from managing big data to leveraging big data. Obviously, data volumes are exploding due to social media and clickstream data, sensor data and other sources, and will only continue to grow. This year’s conference, however, was all about analytics – how to use that data to drive business benefits. And there were great examples given at the conference.
In one of his presentations, Stephen Brobst, CTO of Teradata, described the benefits of collecting weather data around retail stores to determine whether weather conditions have a significant impact on food consumption in the store (e.g. the deli section). He said combining external weather forecasts with internal operational data and analytical information allows stores to adjust staffing and supplies for a huge impact on the bottom line.
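To make the idea concrete, the first step of such an analysis might be a simple correlation check between local temperature and deli sales. The sketch below uses entirely hypothetical figures – it is not Teradata’s data or method, just an illustration of how external weather data and internal sales data can be lined up:

```python
# Does temperature move with deli sales? A basic Pearson correlation
# check on two aligned daily series. All figures are hypothetical.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical daily readings: forecast high (°F) and deli sales ($)
temps = [55, 60, 62, 68, 71, 75, 80]
deli_sales = [410, 440, 430, 500, 520, 560, 600]
print(round(pearson_r(temps, deli_sales), 3))
```

A coefficient near 1 would suggest warm days and deli sales rise together – reason enough to fold the weather forecast into staffing and supply decisions (correlation alone, of course, does not prove causation).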
Shaun Connolly, Program Director of Global Industry Solutions at Teradata, described an example of how FedEx was able to save $60 million in staffing per year…
Fueled by the big data hype and the need to extract greater business value from data, investment in business analytics software is on the rise. Many companies have begun to tap into the potential of big data analytics, and this number is predicted to increase according to recent reports by International Data Corporation (IDC). IDC forecasts that the market will continue to grow at a 9.8% compound annual growth rate through 2016 to reach $50.7 billion. Perhaps to a less aggressive extent, interest in Collaborative BI is also on the rise, with top-performing companies incorporating collaborative techniques to share knowledge throughout the enterprise, according to Aberdeen’s extensive 2011 research report on Collaborative BI. The demands for agile insight and self-service are changing the landscape of BI, driving the need for Collaborative BI, which uses social functionality to improve business decision-making. Separately, the benefits of deploying analytical tools and taking advantage of collaborative techniques are appealing for any organization seeking streamlined operational success – but the payback of merging these initiatives could be even more rewarding.
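As a side note, the compound-annual-growth arithmetic behind forecasts like IDC’s is easy to check: at rate r over n years, the end value is start × (1 + r)ⁿ. The 9.8% rate and $50.7 billion 2016 figure come from the report as cited above; the four-year horizon used below to back out the implied starting market size is my own assumption, not a figure from IDC:

```python
# Back out the starting market size implied by an end value and a
# compound annual growth rate (CAGR): end = start * (1 + r) ** n,
# so start = end / (1 + r) ** n.

def implied_base(end_value, cagr, years):
    """Starting value implied by an end value, CAGR, and horizon."""
    return end_value / (1 + cagr) ** years

# $50.7B in 2016 at 9.8% CAGR, assuming a four-year horizon (2012-2016)
base = implied_base(50.7, 0.098, 4)  # in billions of dollars
print(round(base, 1))
```

The same one-liner works in reverse for sanity-checking any “grows at X% to reach $Y by year Z” claim.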
Analytics is gaining traction in the BI arena due to the need to explore massive amounts of varied information (what we now call big data), extract valuable insight, and quickly deliver these insights to the users who need them. Initiatives geared toward improving analytics utilize technology that gathers and organizes data from disparate data sources and provides a platform for in-depth analysis, yielding benefits such as improved business operations and agility, increased sales, and lower IT costs. So it’s no wonder that organizations are making significant investments in the analytics market.
Collaborative BI, on the other hand, seems to be the new kid on the block…
You know the gut feeling that leads you to take a different route to work or accept one job over another? Those gut feelings may have led you down the right path, but they inform personal decisions where you have only so much information (a traffic report on the radio or both companies’ financials) and where you would expect to rely on instinct. These personal choices affect only you – and potentially your family, in the case of a new job. But relying on gut instinct alone in your business life is a mistake – there’s simply too much supporting evidence to take into account when making business decisions (decisions that affect much more than just yourself). Why play Russian roulette with these decisions when you’re surrounded by analytics?
Sound business decisions are based on facts, data analysis, trend spotting, or other complex calculations, and yes – a bit of intuition. But your instinct should be used as an indicator, not the basis for your decisions. In every business there are variables and unique scenarios that make planning and analysis imperative; neglecting these factors could have serious implications. Consider this example: The 2010 Report of Anton R. Valukas examined the demise of Lehman Brothers, a formerly dominant global financial institution that went bankrupt during the recent financial crisis. It revealed that the company excluded some assets from routine stress tests (meaning the company couldn’t know how much money it was in a position to lose because it was not performing what-if analysis) and valued some real estate investments on a combination of financial projections and “gut feeling,” according to a Lehman Brothers vice president. In essence, the company’s business practices lacked analytic insight, or at least the will to get it. There is no doubt that Lehman Brothers had access to multitudes of data on its assets, on the market, and on its level of risk. Armed with this information, I’d hope executives would have made better choices, taken on less risk, and valued their assets more realistically.