
April 2009 $199.00

Next-Era BI Tech Center

Insight at the Speed of Thought:


Contents

2 About the Author
3 Executive Summary
4 In-Memory Advantages
4 Figure 1: Not Seeing Eye-to-Eye on BI
6 Figure 2: Query Approaches Compared
8 Figure 3: A Faster Approach to Filtering Data
10 Figure 4: 32-Bit and 64-Bit BI Deployments Compared
12 Who's Who in In-Memory Business Intelligence
14 Figure 5: Data Exploration With a Visual Difference

Taking Advantage of In-Memory Analytics

Fast analysis, better insight and rapid deployment with minimal IT involvement: These are the leading benefits of in-memory analytics, but different products are appropriate for different environments. Here's how to choose from among a growing list of in-memory technologies.
By Cindi Howson, BI Scorecard

About the Author

Cindi Howson is the founder of BI Scorecard, a Web site for in-depth BI product reviews. Cindi has been using, implementing and evaluating business intelligence tools for more than 15 years. She is the author of Successful Business Intelligence: Secrets to Making BI a Killer App and Business Objects XI R2: The Complete Reference. She teaches for The Data Warehousing Institute (TDWI) and is a frequent speaker at industry events.

Executive Summary

There was a time when the select few business intelligence users within your organization were happy to get a weekly report. Today, smart companies are striving to spread fact-based decision making throughout the organization, but they know they can't do it with expensive, hard-to-use tools that require extensive IT hand-holding. The pace of business now demands fast access to information and easy analysis; if the tools aren't fast and easy, business intelligence will continue to have modest impact, primarily with experts who have no alternative but to wait for an answer to a slow query.

In-memory analytics promises to deliver decision insight with the agility that businesses demand. It's a win for business users, who gain self-service analysis capabilities, and for IT departments, which can spend far less time on query analysis, cube building, aggregate table design and other time-consuming performance-tuning tasks. Some even claim that in-memory technology eliminates the need for a data warehouse and all the cost and complexity that entails.

There's no doubt that in-memory technology will play a big part in the future of BI. Indeed, vendors ranging from Microsoft and MicroStrategy to Oracle and IBM are jumping on the in-memory bandwagon. Yet no two products deliver in-memory capabilities in the same way for the same business needs. This report helps you understand the value of in-memory technologies, identify candidate applications and products, and assess your organization's readiness for in-memory BI.

In-Memory Advantages
Business users have long complained about slow query response. If managers have to wait hours, or even just a few minutes, to gain insights to inform decisions, they're not likely to adopt a BI tool, nor will front-line workers who may only have time for gut-feel decision making. Instead they'll leave the querying to the few BI power users, who will struggle to keep up with demand while scarcely tapping the potential for insight. In many cases, users never ask the real business questions; instead, they learn to navigate slow BI environments by reformulating their crucial questions into smaller queries with barely acceptable performance.

Such was the case at Newell Rubbermaid, where many queries took as long as 30 minutes. An SAP ERP and Business Warehouse user, the company recently implemented SAP's Business Warehouse Accelerator (BWA), an appliance-based in-memory analysis application. With BWA in place, query execution times have dropped to seconds. "Users are more encouraged to run queries that sum up company-level data, which may entail tens of millions of rows, yet they're not worried about killing [performance]," says Yatkwai Kee, the company's BW administrator. Business users can now quickly and easily analyze data across divisions and regions with queries that previously would have been too slow to execute.

Figure 1: Not Seeing Eye-to-Eye on BI

Which of the following statements best describes your organization's support for business intelligence investments/initiatives?

38% Both management and end users think business intelligence is a major asset and are generally supportive of new BI investments and initiatives.
27% Management thinks BI is an asset, but we have difficulty getting end users to utilize the reports, dashboards and other tools we develop.
28% My end users are hungry for reports and insight, but upper management has held back support or funding for new BI investments and initiatives.
8% Neither management nor end users think BI is a major asset, so they aren't supportive of business intelligence investments or initiatives.

Base: 242 respondents with responsibility for BI
Data: Intelligent Enterprise/InformationWeek Analytics Enterprise Information and Application Priorities Survey of 305 business technology professionals


Beyond the corporate world, in-memory BI tools allow state agencies and cities to stretch tax dollars further while improving services. For example, the Austin, Texas, fire department serves more than 740,000 residents and responds to more than 200 calls a day. The department recently deployed QlikTech's QlikView to better analyze call response times, staffing levels and financial data. QlikTech is an in-memory analytic application vendor that has been growing rapidly in the last few years. With QlikView, users can get to data in new ways and perform what-if analysis, which the department says has helped in contract negotiations.

And the benefits go well beyond the fire department. "Unless we spend more efficiently, costs for safety services will take a larger share of tax dollars, making less budget available for services such as libraries and parks," says Elizabeth Gray, a systems supervisor. Gray says that attendance and payroll data come from different systems and never seemed to make the priority list in the central data warehouse. "With QlikView, we can access multiple data sources, from multiple platforms and different formats," she says. "We can control transformations and business logic in the QlikView script and easily create a presentation layer that users love."

In many cases, in-memory products such as QlikView and IBM Cognos TM1 have been deployed at the departmental level because central IT has been too slow to respond to specific business requirements. A centralized data warehouse that has to accommodate an enterprise's diverse requirements can have a longer time-to-value. Demand for in-memory is also strong among smaller companies that lack the resources or expertise to build a data warehouse; these products offer an ideal alternative because they can analyze vast quantities of data in memory and are a simpler, faster alternative to relational data marts.

A number of the tools that use in-memory approaches facilitate a more exploratory, visual analysis. Vendors TIBCO Spotfire, Tableau Software and Advizor Solutions, for example, take an in-memory approach that offers a stark contrast to many query and OLAP products; instead of starting with a blank screen to build a query or a report, users start with a view of all the data. Held in memory, this data is then filtered down to just the information users are looking for with easy-to-use data selection, sliders, radio boxes and check boxes.

How It Works

As the name suggests, the key difference between conventional BI tools and in-memory products is that the former query data on disk while the latter query data in random access memory (RAM). When a user runs a query against a typical data warehouse, for example, the query normally goes to a database that reads the information from multiple tables stored on a server's hard disk (see Query Approaches Compared, below). With in-memory tools, all information is first loaded into memory. If the in-memory tool is server-based, an administrator may initiate the load process; if it's a desktop analysis tool, the user may initiate the process on his or her workstation. Users then query and interact with data loaded into the machine's memory. Accessing data in memory is orders of magnitude faster than accessing data from disk. This is the real, speed-of-thought advantage that lives up to all the hyperbole.

In-memory BI may sound like caching, a common approach to speeding query performance, but in-memory products don't suffer from the same limitations. Caches are typically subsets of data, stored on and retrieved from disk (though some may load into RAM). The key difference is that cached data is usually predefined and very specific, often to an individual query; with in-memory tools, the data available for analysis is potentially as vast as an entire data mart.
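To make the query-path contrast concrete, here's a minimal sketch in Python — an illustration of the concept, not any vendor's implementation. The conventional path sends every question back to a table on disk; the in-memory path extracts the table once and then filters the RAM-resident copy. At this toy scale, operating-system caching will blur the timing gap; the point is the shape of each pipeline, not the magnitude.

```python
import sqlite3
import time

# Build a small on-disk table standing in for a warehouse fact table.
conn = sqlite3.connect("sales.db")
conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
conn.execute("DELETE FROM sales")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA" if i % 2 else "AMER", float(i % 1000)) for i in range(500_000)],
)
conn.commit()

# Conventional path: every question becomes another query against the disk.
t0 = time.perf_counter()
disk_total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE region = 'EMEA'"
).fetchone()[0]
print(f"disk query:       {time.perf_counter() - t0:.4f}s  total={disk_total:,.0f}")

# In-memory path: one initial extract, then every interaction stays in RAM.
rows = conn.execute("SELECT region, amount FROM sales").fetchall()  # load once
t0 = time.perf_counter()
ram_total = sum(amount for region, amount in rows if region == "EMEA")
print(f"in-memory filter: {time.perf_counter() - t0:.4f}s  total={ram_total:,.0f}")
```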
Figure 2: Query Approaches Compared

[Diagram] The conventional query approach: a query retrieves data from disk, results come back for checking, formatting and limited interaction, and each refinement repeats the slow round trip. The in-memory query approach: an initial extract retrieves data from the data warehouse and loads it into memory; users then check, format and interact with the RAM-resident data directly, which is fast.
Source: BI Scorecard


Another approach to solving query performance problems is for database administrators (DBAs) to analyze a query and then create indexes or aggregate tables in the relational database. When a query hits an aggregate table, which contains only a subset of the data, it may scan only a million records rather than the hundreds of millions of rows in a detailed fact table. Yet one more route to faster performance is to create a MOLAP (multidimensional online analytical processing) database. But whether it's tuning or MOLAP, these paths to user-tolerable performance are laborious, time consuming and expensive. DBAs are often versed in performance tuning for transaction systems, but the ability to tune analytic queries is a rarer skill set, often described as more art than science. What's more, indexes and aggregate tables consume costly disk space.

In-memory tools, in contrast, use techniques to store the data in highly compressed formats. Many vendors and practitioners cite a 1-to-10 data-volume ratio when comparing in-memory systems to traditional, on-disk storage. So while users benefit from lightning-fast queries, in-memory BI is also a big win for BI system administrators. Newell Rubbermaid, for example, says its BWA deployment has eliminated significant administrative time that was formerly required to tune queries. "The queries were so fast [after the in-memory deployment], our users thought some data must have been missing," says Rajeev Kapur, director of business analytics at Newell Rubbermaid. Yet the performance improvement didn't involve analysis of access paths or creation of indexes or aggregate tables as were previously required.
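Where does a ratio like 1-to-10 come from? Vendors combine several compression methods and don't all disclose them, but dictionary encoding of repetitive columns is one widely used building block. The Python sketch below is purely illustrative — the data, and therefore the measured ratio, are invented, and real results vary with column cardinality:

```python
import array
import sys

# A low-cardinality column such as "country": highly repetitive,
# so it compresses well.
column = ["Germany", "France", "Germany", "Spain", "France"] * 100_000

# Encode: store each distinct value once, plus one small integer code per row.
# array type "B" (unsigned byte) works while there are <= 256 distinct values.
dictionary = {value: code for code, value in enumerate(sorted(set(column)))}
codes = array.array("B", (dictionary[value] for value in column))

# Rough size accounting (ignores Python's sharing of identical strings).
raw_bytes = sum(sys.getsizeof(value) for value in column)
encoded_bytes = len(codes) * codes.itemsize + sum(
    sys.getsizeof(value) for value in dictionary
)
print(f"raw: ~{raw_bytes:,} bytes  encoded: ~{encoded_bytes:,} bytes "
      f"(~{raw_bytes / encoded_bytes:.0f}:1)")

# Decode on demand: look a code back up in the reverse dictionary.
reverse = {code: value for value, code in dictionary.items()}
assert reverse[codes[3]] == "Spain"
```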

Differences in Detail

Taking a closer look at how conventional and in-memory products differ, QlikView is an in-memory product that employs what the vendor calls an associative approach. Unlike in conventional OLAP tools, data in QlikView does not have to be hierarchical. Instead, users select desired information and non-relevant information is automatically grayed out. This is a subtle difference, but it's a big deal when compared with conventional query and OLAP tools. For example, all countries and all products are initially available as data filters in the dashboard pictured on the following page, but the moment Germany is selected, only products that sell in Germany (wheels, for example) appear as valid filters; products that don't sell in Germany (such as headsets) are grayed out. With most BI tools, only a subset of the data (the query result) is ever displayed, and users are forced to iteratively select, query and review sometimes invalid combinations of filters. In contrast, with all the data in memory and displayed in this associative way, analysis is faster and more intuitive.
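The associative behavior is easy to model in miniature. The toy sketch below — in no way QlikTech's actual engine, and with invented fields and rows — shows the core idea: a selection on one field recomputes, over the in-memory rows, which values of every other field remain associated and which should gray out:

```python
# Each row lives in memory; selections narrow the associated values.
rows = [
    {"country": "Germany", "product": "wheels"},
    {"country": "Germany", "product": "brakes"},
    {"country": "France",  "product": "headsets"},
    {"country": "Spain",   "product": "wheels"},
]

def associated_values(rows, field, selections):
    """Values of `field` that co-occur with every current selection."""
    matching = [
        r for r in rows
        if all(r[f] == v for f, v in selections.items() if f != field)
    ]
    return {r[field] for r in matching}

selections = {"country": "Germany"}
valid = associated_values(rows, "product", selections)
grayed = {r["product"] for r in rows} - valid
print(f"valid: {sorted(valid)}, grayed out: {sorted(grayed)}")
# -> valid: ['brakes', 'wheels'], grayed out: ['headsets']
```

Adding a second selection, say on product, would narrow the associated values further, which is why this style of exploration feels conversational rather than query-by-query.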


Among the more data-visualization-oriented products, Tableau lets you explore data through a series of sliders, radio boxes and check boxes. The scatter plot diagram on page 14, for example, shows sales and profitability for new products, shown with an orange dot, and for other products, shown with a light gray dot. Users can hover over any dot to see the individual values (as shown by the red dot highlighted in this example).

Figure 3: A Faster Approach to Filtering Data

In this sales dashboard generated by QlikTech's QlikView software, available data is held in memory, and exploration is intuitive. When the country Germany is selected, the other countries are grayed out. States and products that are not associated with Germany are also grayed out, so a user immediately sees which products are sold in Germany and which are not. The SQL queries employed by more conventional BI tools would be much more complex, and users would typically see just the final results, not the full range of possibilities.
Source: QlikTech


By clicking on a data point, users can also see all the underlying details of the record. This is similar in concept to an OLAP drill-through, except that the detailed data is held in memory and returned instantaneously.

Another differentiator for in-memory tools is the way in which users can filter data. As shown in the scatter plot diagram, a series of quick filters is available on the left side of the user interface. The chart currently displays a subset of the data, with sales from 0 to 632 and profit from -415 to 3,174. To display more data, users can simply slide the ruler up or down and the chart is immediately redrawn. Likewise, users can display data from certain regions or product categories by selecting the appropriate check boxes. In a conventional query tool, users would have to specify such filters via a picklist or by manually entering the desired values, and then resubmit the query to the database and wait seconds, minutes or sometimes hours for a subset of the data to be returned. In some query tools, filtering on measures such as sales and profit is not even supported.

This kind of immediate, interactive analysis is particularly important when people are trying to uncover unknown patterns or discover new opportunities. With conventional reporting tools, users typically have a specific question, such as What are my sales for this quarter? With in-memory visualization tools, the question can be less specific and exploratory, akin to show me the data and the patterns in the data. Such is the case at Cleveland Clinic, a top-ranked academic hospital that's using Tableau software to analyze insurance claims. When an insurance company denies payment on a claim, it affects the hospital's profitability. Taking an interactive approach to analysis, Cleveland Clinic can proactively identify patterns in such denials.
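Mechanically, the instant redraw is possible because a slider move triggers a re-filter of points already in RAM rather than a new round trip to the database. Here's a minimal sketch; the data is randomly generated for illustration, though the ranges mirror the figure's:

```python
import random

# 100,000 in-memory points with the sales and profit ranges from the figure.
random.seed(7)
points = [
    {"sales": random.uniform(0, 632), "profit": random.uniform(-415, 3174)}
    for _ in range(100_000)
]

def on_slider_move(points, sales_range, profit_range):
    """Called on every slider move; returns the subset of points to redraw."""
    lo_s, hi_s = sales_range
    lo_p, hi_p = profit_range
    return [
        p for p in points
        if lo_s <= p["sales"] <= hi_s and lo_p <= p["profit"] <= hi_p
    ]

# No query is resubmitted; the RAM-resident data is simply re-filtered.
visible = on_slider_move(points, sales_range=(100, 400), profit_range=(0, 1000))
print(f"{len(visible):,} of {len(points):,} points redrawn")
```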

Options Emerge

In-memory products and capabilities are gaining momentum. TM1 (from Applix, which was acquired by IBM Cognos in 2007) was one of the early innovators of in-memory analytics in the 1980s. Beyond super-fast analysis, one of the compelling features of TM1 and a few other in-memory tools is the ability to perform what-if analysis on the fly. For example, users can input budget values or price increases to forecast future sales. The results are immediately available and held in memory. In contrast, most disk-based OLAP tools would require the database to be recalculated, either overnight or at the request of an administrator.

Adoption of in-memory tools was initially held back by both the high cost of RAM and limited scalability. Just a few years ago, 1 GB of RAM topped $150, whereas today it costs only about $35. So a powerful analytics server with 64 GB of RAM that used to cost $64,000 today costs only about $13,000. The increasing prevalence of 64-bit operating systems (OSes) also makes in-memory analytics more scalable. Conventional 32-bit OSes offer only 4 GB of addressable memory, whereas 64-bit OSes support up to 1 TB of memory (as a reminder, 1 TB = 1,024 GB).
Figure 4: 32-Bit and 64-Bit BI Deployments Compared

[Diagram] A 32-bit deployment caps the operating system, application metadata, shared cache and user workspace at 4 GB of memory; a 64-bit deployment raises the ceiling to 1 TB (1,024 GB), allowing more users, more data, faster performance and fewer servers. (Bars in the original chart are not proportional.)

Many BI technologies are compiled only to 32 bits, which leaves about 2 GB to 3 GB of addressable memory for both the BI application and the data. Microsoft first released its 64-bit version of Windows in 2005, and while adoption of the 64-bit OS is still modest, the implications for scalability and in-memory analytics are huge. With 64-bit processors and OSes, there is up to 64 GB of memory (Windows' theoretical limit is 1 TB). While some BI applications will run on a 64-bit OS in compatibility mode, they must be compiled for the 64-bit OS to leverage all the addressable memory. Recommended reading: Microsoft Support Note 294418, "Comparison of 32-bit and 64-bit memory architecture," April 29, 2008.
Source: MicroStrategy
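The ceilings in the figure follow directly from address width and the OS limit, and a little arithmetic shows why they matter for BI. In the sketch below, the 100-byte row width is an invented, illustrative figure; real rows vary with schema and with how aggressively the in-memory engine compresses:

```python
GIB = 2 ** 30

addressable_32bit = 2 ** 32        # the 4 GB ceiling of a 32-bit address space
windows_64bit_limit = 1024 * GIB   # the 1 TB Windows limit the article cites

row_bytes = 100                    # assumed row width, purely for illustration
for label, limit in [("32-bit ceiling", addressable_32bit),
                     ("64-bit Windows limit", windows_64bit_limit)]:
    print(f"{label}: {limit // GIB:,} GB, ~{limit // row_bytes:,} rows "
          f"at {row_bytes} bytes/row")
# -> roughly 43 million rows vs. 11 billion rows before any compression
```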


The impact of this difference is enormous. Once the OS, BI server and metadata of a typical BI system are loaded in a 32-bit environment (see 32-Bit and 64-Bit BI Deployments Compared), there's not much memory left for the data that users want to query and analyze. In a 64-bit operating system with 1 TB of addressable memory, many companies can load multiple data marts, if not their entire data warehouse, into RAM.

Addressable RAM is certainly a key ingredient in the success of in-memory analytics, and broader adoption of 64-bit OSes will only accelerate that success. HP first released a 64-bit version of Unix in 1996, and Microsoft first released the 64-bit version of Windows in 2005. In the classic chicken-and-egg scenario, customers have been slow to adopt 64-bit OSes, so support from BI vendors has been mixed. Support for 64-bit OSes makes in-memory BI enterprise-scalable, but it is by no means a prerequisite to an in-memory deployment. Many customers are deploying in-memory BI tools on 32-bit OSes, but these systems are more likely to be geared to workgroups and individuals working with smaller amounts of data.

There are signs that adoption of 64-bit OSes and in-memory deployments are picking up steam. In-memory BI vendor QlikTech, for example, has reported sales growth rates of more than 80% per year over the last several years, while the BI market's overall growth over that same period averaged 10%. What's more, even the largest, most well-established vendors are adding in-memory capabilities to their platforms. SAP, for example, released the Business Warehouse Accelerator in 2006. IBM Cognos acquired Applix and its TM1 product in 2007. In March of this year, MicroStrategy released MicroStrategy 9, which introduces an in-memory option as part of the platform's OLAP Services. Then there is the much-anticipated Project Gemini release from Microsoft. Due in 2010, Gemini is an enhancement to Excel and Analysis Services that will bring in-memory analytics to spreadsheet users, enabling them to slice and dice through millions of records with promised sub-second response times.

Relational database vendors have also acquired in-memory capabilities. Oracle acquired TimesTen in 2005, and IBM acquired solidDB in 2008. However, neither vendor has articulated a clear strategy on whether these technologies will be used only to speed transaction processing or also to speed data analysis.


Reality Checks

In-memory technology certainly sounds like the wave of the future, but the tools do have their drawbacks. Here are six in-memory caveats and characteristics to look out for:

Does it play well with your warehouse and existing BI tools? While some vendors tout in-memory as a way of avoiding building a data warehouse, this option usually applies to smaller organizations that may have only a single source system. For larger companies that have multiple source systems, the data warehouse continues to be the ideal place to transform, model and cleanse the data for analysis.

Who's Who in In-Memory Business Intelligence

Vendor/Product | Differentiator | Sweet Spot
Advizor Solutions | Advanced visualization | Interactive data visualization for discovering patterns and anomalies
IBM solidDB | In-memory relational database | Speeding updates in transaction processing
IBM Cognos TM1 | In-memory OLAP | Budgeting and planning with flexible hierarchies and what-if analysis
Microsoft Gemini | Combination Excel add-in and Analysis Services enhancement (set for release in 2010) | Excel users who want to interact with larger volumes of data as personal cubes that can be shared
Oracle TimesTen | Relational in-memory database | Speeding updates in transaction processing
MicroStrategy 9 OLAP Services | In-memory option to BI platform | Existing MicroStrategy deployments experiencing slow query times
QlikTech QlikView | Associative in-memory analysis tool | Departments and workgroups that want highly interactive dashboards, or small businesses that wish to avoid data warehouse deployments
SAP BW Accelerator | Combination in-memory, column-store database and appliance | Existing SAP BW deployments experiencing slow query times
Tableau Software | Advanced visualization | Interactive data visualization for discovering patterns and anomalies
TIBCO Spotfire | Advanced visualization | Interactive data visualization for discovering patterns and anomalies


The degree to which in-memory products integrate with existing data warehouses is somewhat reflective of each vendor's view of warehousing and what it is trying to accomplish. Both SAP BWA and MicroStrategy 9 OLAP Services, for example, are designed to integrate with existing BI environments; the primary benefit is to speed existing queries and make it possible to ask more complex ones. With some of the advanced visualization products, such as TIBCO Spotfire, an administrator creates a separate business model; the emphasis is on visual analysis rather than on speeding existing BI queries. Tableau focuses on visual analysis, but it integrates with OLAP databases such as Oracle Hyperion Essbase and Microsoft Analysis Services and will recognize measures and dimensions from those products. IBM Cognos TM1 and QlikView load data from flat files and ODBC sources, and business rules and calculations are created within the applications. These products have their own viewers, and they are often deployed standalone from an existing data warehouse or OLAP deployment. Quick deployment time is part of their appeal, but their sometimes siloed nature can lead to yet another version of the truth, which can be their downfall.

Is it designed for enterprise use? Gaps in enterprise administrative features, including usage monitoring, single sign-on and change management, are common complaints about in-memory tools. Recognize these limitations, manage expectations, and deploy accordingly.

Will data latency become a problem? Because the data is extracted from a source system or a data warehouse and then loaded into memory, data latency can be a concern. Front-line workers in a customer service center, for example, need near-real-time, highly granular (detailed) data. If an in-memory tool contains last week's product inventory data, it's probably of no use to customer service reps. Thus, the suitability of an in-memory tool and the success of the deployment may hinge on the degree to which the vendor can automate data loads.

Consider user and administrative impact. In-memory analytics tools often introduce some of the same concerns that OLAP stores create: namely, they usually create another data source, with its own calculations, business definitions and interface that users must learn and administrators must manage. Rather than simply formulating a business question, users also have to consider which data source and tool is most appropriate for their analyses. This is where tools such as SAP BWA and MicroStrategy 9 differ from other in-memory approaches: existing queries, reports and dashboards automatically take advantage of in-memory analysis, seamlessly to users.


Figure 5: Data Exploration With a Visual Difference

Data-visualization-oriented products let you explore data with intuitive sliders and check boxes. In this Tableau-generated scatter plot diagram, sales and profitability for "new" and "other" products are shown with orange dots and light gray dots, respectively. Users can hover over any dot to see individual values (as shown next to the red dot). By adjusting the sliders at the left, users can reset parameters (from 0 to 632 for sales and from -415 to 3,174 for profit) and the data, held in memory, is immediately redrawn. Conventional query tools typically require that you specify filters in advance, resubmit new queries and wait seconds, minutes or even hours for new data.
Source: Tableau


Administrators are not adding calculations and business logic within another layer; these reside within the existing InfoCubes for SAP or the project for MicroStrategy. Home improvement retailer Lowe's beta tested MicroStrategy 9 and was pleasantly surprised that existing reports automatically leverage the in-memory cache. "With hundreds of thousands of user-owned copies of existing reports, we had assumed users would have to re-create their personalized versions," says Stan Carman, solutions project manager. "That was not the case." While these approaches integrate with the existing environment, they generally lack the visual discovery capabilities found in other in-memory products. Business users would like all of these technologies to converge, but the market isn't there yet.

Consider support for Web-based deployment. Some in-memory tools are not nearly as Web-enabled as their conventional BI counterparts. This seems to reflect both technology immaturity and a tendency toward niche deployment. Conventional BI query tools have been deployed more widely, with Web-based deployments extending their reach. Many of the in-memory visualization products discussed in this article are primarily deployed in desktop authoring paradigms. Once built, applications can usually be deployed to a server and accessed via a Web browser. However, Web-based interactivity isn't likely to be as robust as the capabilities of the desktop application. For example, QlikView users cannot create new dashboards and calculations via the Web, and a Java applet is required for rich browser-based interactivity. Consider how widely you wish to deploy in-memory tools and evaluate Web-based versus thick-client capabilities.

Gauge the costs and benefits of in-memory deployments. An in-memory deployment involves acquiring and supporting another tool and additional server infrastructure. In-memory investments often deliver great value, but the benefit may be hard to quantify and articulate. Businesses often aren't aware of tangible cost-of-ownership benefits, such as saving disk space and reducing the administrative labor that would otherwise be required to speed queries. Companies contemplating in-memory approaches would do well to raise the profile of these backroom tasks to better build the business case. BI teams that have oversold their existing deployments will face tougher scrutiny in trying to bring in new but complementary tools.


Didn't the data warehouse promise better insight? Didn't standard query tools promise faster insight? Positioning the tools with the right user segments and types of applications is critical in the cost/benefit analysis and in the overall BI tool strategy.

Forge a Strategy

In-memory analytics can help companies extend the reach of BI with minimal IT resources, but to take best advantage, heed the following advice:

Understand the BI bottlenecks. Are users complaining about poor query response times? Do complex queries time out before finishing? Does poor performance prevent users from asking important business questions? If so, consider in-memory technology, but where available, look for tools that speed existing query performance without introducing an additional architecture.

Identify the business opportunity and consider what users do with the data. If your BI deployment is report-oriented and does not facilitate what-if analysis, interactive filtering, and discovery of patterns and new opportunities, then adding an in-memory visualization or OLAP tool may be beneficial. If users routinely dump data into spreadsheets for interactivity, it may be a warning sign that your BI environment is too inflexible and not aligned with business requirements. Conversely, there are times when all users need is a report or a single number. Don't expect in-memory analytics to be the answer for everyone.

Understand your scalability requirements. For broad deployments accessing large data volumes, ensure that the vendor supports 64-bit deployments of your preferred operating system, whether that's Windows, Linux or Unix. Consider the importance of Web-based authoring and Web-based information consumption: broad, distributed/extranet deployments demand rich Web-based interfaces and enterprise-class administrative features.

As with most challenges in business intelligence, technology is only part of the answer. The real value of in-memory BI is not just how fast it is but, more important, the decisions that can be enhanced, the tough business questions that can now be answered and the new opportunities that will be discovered.
