“Intelligence” is a word much bandied about in enterprises, yet frankly, few use it properly to denote what constitutes authentic intelligence.
"Intelligence is nothing if not an institutionalized black market in perishable commodities"
Master spy novelist John le Carré
While le Carré’s notion refers to information gathered by spies about enemies, this sort of thinking has colored how enterprises view intelligence: the secret information that will solve all problems and kill all enemies. It doesn’t matter whether you’re talking business intelligence or market / competitive intelligence. The apparent allure of the military and spy industries has badly skewed how many companies view intelligence.
The kind of intelligence that truly benefits enterprises derives from classic definitions of the word: learning, understanding, and applying knowledge and experience to new situations to make better decisions. For both business intelligence and market/competitive intelligence, the frequent catchphrase is “making better business decisions”. Both involve data, analysis and recommendations. Both can have strategic and tactical applications. Both benefit from qualitative as well as quantitative dimensions. And both should strive for authentic INTELLIGENCE, not just the collections of information that result.
Business Intelligence methodologies and tech solutions have been used to generate actionable views of a company, frequently rooted in business operations. Until recently, BI outputs have been primarily reports – some might say too many non-contextual reports, many of which may be trivial, pointless, or worse, misleading. The BI world includes analytics, data mining, text mining and predictive analytics.
And sometimes BI is considered to be ‘competitive intelligence’, hence my references to market intelligence. Market/competitive intelligence and business intelligence should be complementary, but they are not the same thing. In my view, a certain amount of BI output contributes ‘intelligent data’ to market intelligence initiatives. Overall, market intelligence gleans ‘data’ from many disparate sources inside and outside the firewall, both qualitative and quantitative, filtered by specific strategic contexts. Until recently, BI has mostly derived from practices and technology that address structured data and business processes inside the firewall. BI will now have to include unstructured or content sources in analytics, to add the missing subtleties, context, and insight that cannot be pulled from structured data sources.
Analytics Everywhere
In addition to overt BI-style analytics, many other analytics solutions have been proliferating to help make better business decisions. Web analytics were siloed for a while as relevant only to website usage. But web analytics now take many forms and have become very sophisticated. Data captured by web analytics is now important to many teams across the enterprise and is becoming more integrated into other “business intelligence” endeavors.
With fast-growing social media applications, social media analytics are now emerging – again, the data generated is essential and must be integrated into the overall intelligence base of the enterprise. Content analytics have become even more important as unstructured content proliferates from many sources.
These seemingly diverse analytics cannot be confined to individual silos in enterprises. “Intelligence for the Business” that results from all analytics is needed at LOB / department levels, as well as at the über-enterprise level. And mid-market enterprises need such business intelligence capabilities as well. So all kinds of analytic tools need to become more accessible to many roles and organizational segments that have not necessarily been targets of such solutions before.
Unfortunately, keeping pace with the enormous proliferation of data, and even with the various analytical outputs, is overwhelming many enterprises. More integration between what began as silos of analytics must now happen. Enterprises have to get better at the behind-the-scenes / get-your-hands-dirty work that is essential to make realistic and effective use of all the data (structured and unstructured) pouring in from so many sources. The Semantic Web will likely contribute overall to the integration of structured and unstructured data to add to intelligence relevance.
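To make that integration concrete, here is a minimal sketch in Python with pandas – the CRM columns and sentiment scores are invented for illustration, not drawn from any vendor’s product – showing structured operational records enriched with a signal distilled from unstructured support content:

```python
import pandas as pd

# Structured data from a CRM (hypothetical columns for illustration)
crm = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "segment": ["enterprise", "mid-market", "mid-market"],
    "annual_revenue": [250_000, 40_000, 55_000],
})

# 'Intelligent data' distilled from unstructured sources, e.g. an average
# sentiment score per customer extracted from support tickets
ticket_sentiment = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "avg_sentiment": [0.7, -0.4, 0.1],
})

# One integrated view: structured operational data plus a signal that
# only unstructured content could supply
combined = crm.merge(ticket_sentiment, on="customer_id", how="left")
print(combined)
```

The merge itself is trivial; the hard, behind-the-scenes work is producing a trustworthy sentiment-style signal from messy content in the first place.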
Predictive Analytics – The Bold New LOB Tool
Considered a subset of BI, but also present in other categories of analytics solutions, predictive analytics may become much more in demand than traditional BI solutions, especially with the new directions of innovative predictive analytics solution vendors.
Eric Siegel, writing in Information Management on predictive analytics – citing a marketing use case:
Predictive analytics is data mining technology that uses your customer data to build a predictive model specialized for your business. …The real trick is to find the best predictive model.
…a careful combination of predictors performs better customer prediction by considering multiple aspects of your customers and their behaviors. Predictive analytics finds the right way to combine predictors by building a model optimized according to your customer data.
Predictive analytics builds models automatically, but the overall business process to direct and integrate predictive analytics is by no means automatic - it truly needs your marketing expertise.
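To make Siegel’s point about “combining predictors” concrete, here is a minimal sketch using Python and scikit-learn. The predictors (recency, frequency, spend) and the response label are synthetic stand-ins, not a real marketing dataset:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical predictors for 500 customers: recency, frequency, spend
X = rng.normal(size=(500, 3))
# Hypothetical outcome: did the customer respond to the campaign?
y = (X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2]
     + rng.normal(scale=0.5, size=500)) > 0

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The model learns how to weight and combine the predictors
model = LogisticRegression().fit(X_train, y_train)

# Evaluate on held-out customers; "finding the best model" is an
# iterative loop of features, models and validation, not a one-shot call
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

The two lines that build and score the model are the easy part – the marketing expertise Siegel mentions goes into choosing predictors, framing the outcome, and judging whether the validation results are good enough to act on.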
Predictive analytics can be part of “traditional” BI offerings – it’s also part of the web analytics world. Baynote provides ‘Adaptive Web’ solutions that include analytics, personalization and social search for customer-focused relevance on websites, to help optimize engagement and retention. Baynote’s Collective Intelligence Platform provides companies with predictive analytics based on the implicit patterns of customers visiting their websites. Omniture, WebTrends and other web analytics solutions also provide capabilities for predictive analytics.
Content Analytics, Content Intelligence
Enterprises have been challenged for years on the content intelligence front – it has been very difficult and very messy to extract meaning (while retaining context) from unstructured content so that it can be used as “data”. The amount of enterprise content is enormous, with geometric expansion in play. For enterprises that publish content on the web, content analytics also offer the means to better connect with target markets and customers:
"Content analytics" can be seen as business intelligence (BI) for/from content, as text (rather than number) crunching that generates insights to improve business outcomes. The two practices, content analytics and BI, certainly share motivations. If you don't analyze your content/data, you may be missing opportunities and running risks.
For content publishers, analytics drives better targeted content delivery, expanded audiences, and secondary uses and new distribution channels. These outcomes add up to profit. On the flip side, they are matched by reduced risk and cost avoidance given possibilities for more complete, more accurate compliance screening, e-discovery, and storage management.
Analytics also boosts value for users. Semantic search, faceted navigation, and content annotation/enrichment create findability and improve user experience and value for users. They also let users treat content like data. Call the goal "content intelligence," enabled by "smart content."
Seth Grimes on Content Analytics
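As a toy illustration of that “text crunching”, the following Python sketch uses scikit-learn’s TF-IDF weighting on an invented three-document collection to surface each document’s most distinctive terms – a crude first step toward treating content like data:

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# An invented mini-collection standing in for real enterprise content
docs = [
    "Customer churn rose after the pricing change in the enterprise tier.",
    "Support tickets mention slow dashboards and confusing report layouts.",
    "Prospects keep asking for predictive analytics and cloud deployment.",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(docs)
terms = vectorizer.get_feature_names_out()

# List each document's most heavily weighted terms
for i, doc_vector in enumerate(tfidf.toarray()):
    top = doc_vector.argsort()[::-1][:3]
    print(f"doc {i}:", [terms[j] for j in top])
```

Real content analytics goes far beyond term weighting – entity extraction, sentiment, classification – but even this sketch shows how unstructured text can yield rows-and-columns signals that BI tooling can consume.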
“Intelligence” Hits The Wall
Stephen Few’s graphic for “BI Has Hit the Wall” is quite expressive of where BI – and all analytics – need to head to deliver evolving value to enterprises. Stephen Few is a leading expert in creating effective data visualizations that communicate real intelligence and don’t just function as “eye candy”:
Contained in these early definitions was the seed of an inspiring vision that caused people like me to imagine a better world, but the business intelligence industry has done little to help us achieve the vision of the people who coined the term. When Thornton May was interviewing people for his book “The New Know”, he asked a prominent venture capitalist known for his 360-degree view of the technology industry what he thought of when he heard the phrase business intelligence. His response was “big software, little analysis.”
For information to be useful, we must explore it, analyze it, communicate it, and monitor it, but the BI industry’s attempts to support these activities with few exceptions have been tragically comical. The technology-centric, engineering-oriented perspective and skill set that has allowed the industry to build an information infrastructure is not what’s needed to support data sense-making. To use the data that we’ve amassed, a human-centric, design-oriented perspective and skill set is needed.
New World Focus on BI and Predictive Analytics
Some of the significant changes to analytics solutions relate to how final artifacts are “formatted”. There has been a growing transition from traditional report formats to interactive visualization tools, and to collaborative processes that enhance the final artifacts. More companies want BI to help with current and future needs and goals, rather than just measuring the past. And these companies should be inviting in more and more individuals (via collaboration) to help refine the accuracy and contributions of analytics outputs. Newer on-demand BI / analytics solutions are providing “right now” intelligence, though these are still works-in-progress, for the most part.
A number of articles lately have made the point that BI initiatives have not been highly successful for many customers, while making another important point: that “failure” actually is part of the BI refinement cycle. But for costly enterprise-style BI solutions, “failure” can be quite expensive. Gerry Brown of Bloor comments:
However, there is little doubt that the number of BI users overall is increasing. With low-cost operators such as QlikTech, Tableau, LogiXML, Pentaho and Jaspersoft you get an awful lot of BI software for $25,000. Conversely, most enterprise BI vendors don't traditionally get out of bed for deals of less than $50,000. Low cost (or 'free') BI software encourages trial and experimentation with little risk, and this is what most enterprises prefer. After all, BI project 'failures' are common.
Referring again to the graphic for “BI Has Hit the Wall”, Stephen Few sees that “the traditional BI software vendors and most of the industry’s thought leaders are stuck on the left side of the wall” – while the future of BI will come from more non-traditional BI solutions:
The software vendors that are providing effective data sense-making solutions—those that make it possible to work in the realm of analytics on the right side of the wall—have come from outside the traditional BI marketplace. Vendors like Tableau, TIBCO Spotfire, Panopticon, Advisor Solutions, and SAS tend to either be spin-offs of university research or companies that have ventured into the BI marketplace from a long history of work in statistics.
In a very interesting and thought-provoking Gartner Magic Quadrant for BI platforms, the analysts see significant changes in the solutions sought by enterprises:
…there is significant, if not euphoric, satisfaction with, and accelerated interest in, pure-play BI platforms. This is particularly true for smaller, innovative vendors filling needs left unmet by the larger vendors. To understand this paradox, it is necessary to consider a number of factors that are driving the BI platform buying decision today.
Other vendors to watch for new ways to perform and use BI / analytics include Lyzasoft, Predixion Software and GoodData. Lyzasoft and Predixion recently became partners for cloud-to-cloud analytics-as-a-service. GoodData provides BI PaaS to help speed up solution deployments.
BI SaaS adoption, while very low today, will grow steadily as maturing BI SaaS solutions are delivered in private and public clouds and in on-premises and off-premises configurations by trusted vendors. …innovative, pure-play vendors offering highly interactive and graphical user interfaces built on alternative in-memory architectures to address their unmet ease-of-use and rapid deployment needs. The perceived benefit is so compelling that business users are making this choice, despite the risk of creating fragmented silos of applications and tools.
Business users in particular showed a growing impatience with the time to deploy and complexity of traditional enterprise tools, which led to a rise in departmental buying of alternatives.
Collaboration and the Human Element
One newer area of interest for BI / analytics solutions is the inclusion of collaborative activities to add contextual and qualitative layers to the output of BI processes. To achieve authentic intelligence, contextual / qualitative layers can provide a strong basis to test, fine-tune and filter the artifacts of analytics. Analytics can benefit greatly from human filters that bring experience, knowledge and creative thinking. Context has a big role here: context for sources, context for outcomes, context for usage with other data points to achieve the best Intelligence for “making better business decisions”.
Collaboration is far more than distributing and sharing documents. It is interactive, inclusive, cross-enterprise. If collaboration is to be part of analytics, then iterative collaborative processes should be established throughout analytics cycles. Collaboration for analytics means bringing in disparate people to test assumptions, validity of data sources, accuracy and relevance of the outputs. These additional participants should provide a wealth of experience, complementary concepts and other essential data and perspectives. Having to prove the relevance and accuracy of analytics results to sympathetic as well as less sympathetic individuals should strengthen BI / analytic processes and projects.
The possibilities for new applications of analytics increase with collaboration. Inviting in many-to-many interactions also opens up processes to new ideas from participants. Gartner found that social venues and collaboration help to track and capture outcomes of the decisions made based on BI / analytics:
Gartner's user surveys show that improved decision making is the key driver of BI purchases. However, most BI deployments emphasize information delivery and analysis to support fact-based decision making, but fail to link BI content with the decision itself, the decision outcome, or with the related collaboration and other decision inputs. This makes it impossible to capture decision-making best practices. Solutions are emerging that tie BI with social software and collaborative tools for higher-quality, more transparent decisions that will increase the value derived from BI applications.
Vendors embracing collaboration as an essential part of BI / analytics include Lyzasoft, Predixion and TIBCO Spotfire. Keep in mind that these vendors are all in the early stages of building in collaboration as a significant aspect of their analytics solutions.
Analytics Aren’t Always Fun and Games
While there are very interesting analytics solutions that are “friendly” to business users, it is essential that safeguards are in place to ensure that business users understand what they are doing with analytic models and whether the resulting “intelligence” artifacts are correct, meaningful and useful. Collaboration with others who know the data, understand analytics, visualize the big picture, and so on, is one safeguard to ensure reliable analytics outcomes and correct usages.
In a particular enterprise, do enough people know what to do with analytics, both to start processes in meaningful ways and to audit outcomes to validate accuracy and relevance? Are users chasing the right problems or questions, providing the right data sources, and including enough of the pieces? Do they have the understanding to work in a “big-picture” sense? Yes, analytics vendors should build in methods for validation, testing and guidance, but is that enough? Bob Warfield addresses this concern:
Few enterprises have the right analytical talent.
I was musing not long ago with VC and fellow EI Evangelos Simoudis that very few people actually know how to ask questions in a way that solves problems. It is something of a Sherlock Holmes conundrum. All the data is available. It is shatteringly obvious once someone connects the dots. Yet, very few know how to step across the stones that peek above the raging torrent of data to get to the other side where the answer lies without falling in and getting wet.
“You see Watson, but you do not observe.”
Beyond the traditional users of BI / analytics, there is an additional “challenge” to vendors to reach potential users who “need predictive analytics but don’t know it”. The challenge comes in two parts: first, to find them and connect meaningfully with them; and then to provide guidance for every step of using the analytics solution to ensure proper outcomes. To connect with these “new” analytics users, vendors have to help them understand why they need analytics and how to use it. Self-serve training modules must be available not only for using the solution but also for understanding what analytics mean to the user’s role, industry and business needs. Detailed templates and best practices organized by industry scenarios are also important for success. It will be interesting to see how successful various vendors are at empowering such non-traditional users.
The Elephant in the Room: Reliable Structured Data
Although I’m not addressing data warehousing in this article, I do want to point out the need for basing any data-driven intelligence / analytics processes on reliable data. While it’s the right direction to provide analytics tools that work well for business users and increase ease-of-implementation and ease-of-use, diligence must be exercised to guarantee that the data sources fed into analytic processes are trustworthy. Otherwise: garbage in, garbage out (GIGO).
Teams composed of IT and business staff must collaborate to create and maintain excellent processes for data profiling, data cleansing / quality, and data integration. The business users know how source data is used and the context for analytic processes; IT is the partner to help with the data profiling, data quality and integration steps. Data profiling tools can be used by business users to audit data sources, test for errors and/or the lack of the right data, and to understand different data sources.
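As a small, hypothetical example of what that profiling step can look like before any analytics run, here is a Python / pandas sketch that flags missing values, out-of-range values, duplicates and inconsistent codes in an invented orders table:

```python
import pandas as pd

# An invented orders table seeded with typical data-quality problems
orders = pd.DataFrame({
    "order_id": [1, 2, 2, 4],               # duplicate id
    "amount": [120.0, -15.0, 80.0, None],   # negative and missing values
    "region": ["east", "EAST", "west", "west"],  # inconsistent casing
})

profile = {
    "rows": len(orders),
    "missing_amounts": int(orders["amount"].isna().sum()),
    "negative_amounts": int((orders["amount"] < 0).sum()),
    "duplicate_order_ids": int(orders["order_id"].duplicated().sum()),
    "region_codes": sorted(orders["region"].str.lower().unique()),
}
print(profile)
```

None of these checks is sophisticated, and that is the point: catching this class of problem is cheap compared with discovering it after the “intelligence” has already shaped a decision.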
"Intelligence is nothing if not an institutionalized black market in perishable commodities"
A lot of data and content that matter for better “Intelligence for the Business” come with a short shelf life of significance. The new direction for many analytics solutions should bring this information more quickly into intelligence processes. When enterprises succeed in achieving richly integrated knowledge and context, authentic INTELLIGENCE is likely to result for “making better business decisions”.
Related Links
Market Intelligence – Making Better Business Decisions at the Core of the Company
Shallow Thinking Ensures Bad Business Decisions
Levels of Social – and Integration – Cut Across Enterprises
The incongruous worlds of “data integration” – Making data accessible for any-size companies
Content Marketing Strategy for B2B Software Vendors: Starring the ‘New’ White Paper
Disclaimer: Julie Hunt is not affiliated with any of the vendors in this article.
About the author: Julie Hunt is an accomplished software industry analyst, providing strategic market and competitive insights. Her 20+ years as a software professional range from the very technical side to customer-centric work in solutions consulting, sales and marketing. Julie shares her takes on the software industry via her blog Highly Competitive and on Twitter: @juliebhunt. For more information: Julie Hunt Consulting – Strategic Product & Market Intelligence Services
Only recently have enterprises begun applying text analytics to in-depth market research content in order to distill "strategic scenarios", thus truly advancing the competitive intelligence function with automated analysis (what I call "meaning extraction"). In order for such analysis to have maximum value, it must be conducted against *all* of an organization's market research assets -- including primary and licensed secondary content, plus whatever news sources and industry or government databases are relevant to the company and its industry -- and it must be possible to conduct a unified search of that heterogeneous repository so all the relevant documents are part of the analysis. None of this is easy to do for many reasons -- and those companies that are doing it successfully have a significant leg up on their competition. Without "meaning extraction" technology in place, a common-variety search engine applied to a market research database of rich customer information puts an enormous burden on the user to read enough of the reports to gain an overall understanding of a topic like “What are my customer’s strategic priorities right now?”
Posted by: David Seuss | 11/03/2010 at 01:52 PM
Hi David,
I appreciate your taking time to read my article and leave your comment.
These are very interesting and challenging times for many kinds of analytics, with a great many tech innovations in play. The participation of humans in analytic initiatives remains equally important.
Cheers,
Julie
Posted by: Julie Hunt | 11/04/2010 at 12:27 PM
Great stuff, Julie.
This struck me:
**Although I’m not addressing data warehousing in this article, I do want to point out the need for basing any data-driven intelligence and analytics processes on reliable data.**
How many times have you seen people spend tons of cash on fancy tools that contain suspect data?
Talk about putting perfume on a pig....
Phil
Posted by: Phil Simon | 11/30/2010 at 09:01 AM
Hey Phil! Glad you stopped by and had some time to read this article.
Thanks for adding your excellent point re: ascertaining - and maintaining - data quality before building out solutions.
Cheers,
Julie
Posted by: Julie Hunt | 12/02/2010 at 01:59 PM