
Analytics exploration today and tomorrow

The evolution of analytics to meet an organisation's needs


February 2014

Existing analytics approaches meet point needs for specific roles within an organisation. However, how analysis is carried out, and the roles of the people carrying it out, will need to adapt dynamically to meet the organisation's changing needs. The analysis of a blend of multiple different data sources will help to add distinct business value to an organisation.

Clive Longbottom Quocirca Ltd Tel: +44 118 948 3360 Email: Clive.Longbottom@Quocirca.com

Rob Bamforth Quocirca Ltd Tel: +44 7802 175796 Email: Rob.Bamforth@Quocirca.com

Copyright Quocirca 2014

Executive Summary

It is time to place analytics in the right hands

Data analytics should not be solely in the hands of IT, nor even of data scientists or other data specialists. Today's organisations require intelligence from their data rapidly, which means placing the analysis tools directly in the hands of the person needing the results. This requires not only an intuitive system, but also the capability to analyse a range of data sources, from existing operational data stores to less structured information sources.

Effective decisions can only be made with complete data availability

Data by itself has no value. It needs to be analysed, filtered and reported on to become information. Information then needs to be turned into knowledge, and only at that point can an effective decision be taken by pulling together all the knowledge that is available. Missing any key piece of knowledge can lead to the wrong decision, with a corresponding negative, and potentially catastrophic, impact on the business.

The key is to blend data sources as required

Completely changing an existing environment to provide a new architecture and platform should not be required. Existing data stores must be blended with new data types to enable new insights. Blending at as early a stage as possible compresses the data-information-knowledge process, allowing effective decisions to be made faster.

Reporting remains a problem

For many organisations, reporting against data remains a slow, static process. Reports still have to be created by skilled staff. They are run in batch mode against single data sources or against nominally integrated multiple databases, and the results are distributed as a Word, Excel, PowerPoint or PDF document. The reader cannot drill down into the data to check that decisions have been made against the right underlying data, or even see whether all available data has been included.

Blending data sources requires a new approach

The world no longer runs purely on data held in formal databases. Documents, social networks, internet searches and other information sources must be taken into account when making decisions. Any new analytics and business intelligence platform must be capable of accepting a variety of data types and enabling these to be analysed and reported on swiftly and effectively.

The future lies in enabling the value chain

An organisation can no longer define itself in isolation. It will have suppliers and customers, and each of these may have their own suppliers and customers. The secure flow of intelligence up and down this value chain will be critical in defining which organisations succeed in tomorrow's markets. Any analytics and business intelligence platform chosen must be able to embrace and blend data sources along the chain, and provide active reports that can be used by all necessary parts of the chain.

Conclusions

Business analytics has to be democratised to uncover its hidden value to the business. Blending different data types and sources ensures continued data veracity and strong value in results, helping an organisation compete effectively in the future. Less-structured data will be a core part of any future analysis process, and such sources will have to be included in the mix. Any system chosen must enable general employees to analyse multiple data sources in their day-to-day work, and to securely and effectively share their findings along a value chain that extends beyond the organisation.


Analysis: a static solution?


The majority of data analytics currently carried out within organisations is based on the analysis of specific data sets created by specific applications. For example, a sales manager may want to run an analysis of performance against targets using their sales force automation (SFA) system; a marketing manager may want to see how a campaign performed using their customer relationship management (CRM) system. Few organisations are pulling together multiple data sets in order to run advanced analytics against a more coherent and complete set of data.

In the above case, it would make far more sense for the sales manager and the marketing manager to analyse their data sets in a complementary way. For example, a campaign may have failed because the sales force did not fully understand the offer. This would be apparent from the SFA system, but not from analysing the CRM data.

On top of this is the need to include different data types. Combining the SFA and CRM data with less structured data from web searches or social network streams could give greater insights: it may well be that customers who would have been happy to buy the product or service offered through the campaign found that it couldn't be delivered quickly enough, or was not available in the right size or colour, and that they had mentioned this in the best way they saw fit, through Facebook or Twitter. Without a mechanism for including such data, the analysis cannot provide the true insights users need in order to serve the organisation more effectively.

The mechanism for approaching analytics needs to change. More employees need to be empowered to access the data using suitable tools; they need to be able to pull together and blend disparate data streams and analyse them in an intuitive and visual manner. They need to pull out meaningful findings and distribute these as live reports, where others can drill down behind the data to ensure that the decisions made are fully supported by the available data, and add other available sources. Data analysis needs to be broadened beyond technical specialists, such as IT and data scientists, and put in the hands of the general knowledge worker, providing greater capabilities across the whole business.

This report will be of interest to those tasked with ensuring that a suitable data management and analysis platform is implemented within their organisation to provide a strategic, flexible, long-term analytics platform.
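As a minimal illustration of the SFA, CRM and social blending described above, the following sketch joins three invented extracts on a shared campaign identifier. All table names, column names and figures are hypothetical, and a real platform would do this at far greater scale:

```python
import pandas as pd

# Hypothetical extracts; in practice these would come from the SFA,
# CRM and social listening systems. All names are invented.
sfa = pd.DataFrame({"campaign_id": [1, 2], "deals_won": [40, 2]})
crm = pd.DataFrame({"campaign_id": [1, 2], "leads": [500, 600]})
social = pd.DataFrame({"campaign_id": [1, 2], "sentiment": [0.4, -0.6]})

# Blend the three sources on the shared campaign identifier.
blended = crm.merge(sfa, on="campaign_id").merge(social, on="campaign_id")

# One possible view: many leads, few won deals and negative sentiment
# together suggest a delivery or availability problem, not a bad offer.
suspect = blended[(blended.deals_won < 0.05 * blended.leads) & (blended.sentiment < 0)]
print(suspect)
```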

The decision pyramid


Data is just data until an action is carried out on it. The idea is to turn data into information in such a manner that knowledge can be gleaned, ensuring the right decision can be made based on the information (see Figure 1).

With first-generation data analysis, this was relatively simple. An application created its own data in a single database, on top of which sat a report engine that could turn out pre-defined reports on a regular basis for users. This was fine at the time, but really only moved data up to the information stage. There was no real knowledge being gleaned, as the information was static, without the ability to include other data sources. As such, the information being sent up to decision makers was limited and incomplete, often leading to less informed, and possibly damaging, decisions being made.

Enterprise application integration (EAI), enterprise service buses (ESBs), master data management (MDM) and other approaches all attempted to pull together different applications. Massive data warehouses with cubes, marts and other architectures were put in place to try to improve the move from information to knowledge. However, without the right analysis capability over the top of all of this, extracting the knowledge was still problematic and slow, with decisions being made too late to really make the difference that the business required.


Figure 1: The decision pyramid (data, information, knowledge, decision)

A major change has been required for some time, and the advent of more user-friendly, multi-data source analysis systems has created a more capable environment. Essentially, there has been a move to provide the following incremental capabilities:

• Business reporting: the capability to provide insights into what has happened in the past
• Business analysis: the capability to show insights into what is happening now
• Business intelligence: the capability to show insights into what is likely to happen in the future

The future is around business intelligence with a full capability for predictive analysis and the use of "what if?" scenarios. However, this requires changes to how IT and the business approach their data architecture, and to the tools they choose for analysing the data sets. By bringing all needed data sources into play as early as possible, moving through the data-information-knowledge process can be speeded up, enabling effective decisions to be made far more rapidly, and so enabling better competitive moves in the market.
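To make the three levels concrete, the sketch below applies them to one invented monthly sales series: reporting summarises the past, analysis inspects the present, and a simple linear trend stands in for prediction. A real BI platform would, of course, use far richer predictive models:

```python
import numpy as np

# Invented monthly sales figures, purely for illustration.
sales = np.array([100, 104, 110, 117, 121, 130], dtype=float)
months = np.arange(len(sales))

# Business reporting: what happened in the past.
print("total to date:", sales.sum(), "best month index:", int(sales.argmax()))

# Business analysis: what is happening now.
print("latest month:", sales[-1], "change vs previous:", sales[-1] - sales[-2])

# Business intelligence: what is likely to happen next. A least-squares
# linear trend is the simplest possible stand-in for prediction.
slope, intercept = np.polyfit(months, sales, 1)
print("forecast for next month:", slope * len(sales) + intercept)
```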

The inflexible report


Data analysis used to be a technical problem. To analyse the data, code and scripts had to be created that would query the data and then format the findings in such a way as to meet the needs of the user. In most cases, the user would not be technically adept enough to carry out the coding and scripting, so this task was given to the IT team. The IT team would spend time creating the report code and script and provide this back to the user. The user would run it, and then find that the report wasn't really what they wanted and generated new questions the report did not answer. They would then have to go back to IT and ask for the code and scripts to be changed. This could go through many iterations, with the user generally settling for something that wasn't quite what they wanted, but was good enough.

Now, many data analysis tools have become far more dynamic, with the user able to take a script-free approach to data analysis. Data can be more easily turned into visual reports with simple drag-and-drop techniques that are far more intuitive to users. In some cases, the reports created will be live: they can be sent on to other people so that they can change the views to meet their requirements, with the knowledge that they are still seeing reports against the exact same data set used by the original user. This has caused a sea change in how data analysis is viewed, with an increasing number of users finding they can improve their performance and decision-making capabilities.
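The underlying idea, one shared data set supporting several user-chosen views, can be sketched with pandas pivot tables rather than any particular vendor's drag-and-drop front end. The data and column names here are invented:

```python
import pandas as pd

# One shared, invented data set that two users view differently.
orders = pd.DataFrame({
    "region":  ["North", "North", "South", "South"],
    "product": ["A", "B", "A", "B"],
    "revenue": [120.0, 80.0, 150.0, 60.0],
})

# The sales manager's view: revenue by region.
print(orders.pivot_table(values="revenue", index="region", aggfunc="sum"))

# The marketing manager's view: revenue by product. Same underlying
# data, so any drill-down resolves to identical source rows.
print(orders.pivot_table(values="revenue", index="product", aggfunc="sum"))
```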

Creating a business intelligence platform


Creating a platform suitable for dealing with future data analysis needs requires looking at what big data means to an organisation. Using Quocirca's "5 Vs" definition, we see the need for dealing with:

• Volume: just how much data is there that requires action?
• Variety: what are the different types of data (relational, documents, image, voice, video, etc.) involved?
• Velocity: how fast are the data assets being fed into the system, and how fast does the user require results?
• Veracity: how good is the quality of the data being fed into the system?
• Value: what value will the results provide to the user and the organisation?

Covering the above will require a change in how many organisations approach their data analysis. For a start, basing a total approach on a relational database will not work. Forcing less-structured data assets, such as files and voice, into a relational database as binary large objects (BLObs) does not give the capabilities that are required for full business intelligence. A hybrid mix of data stores capable of dealing with structured and less-structured data will be required, along with the capability to span across these stores for analysis purposes. The solution chosen must also be able to operate across a range of source data stores, aggregating everything as required and minimising the volume of data that requires actual analysis.

This blending of different data sources must not be confused with existing approaches to data federation. Data federation is aimed at bringing together existing relational data sources through large data warehouses, and misses out on the extra value that can be brought in through the addition of less-structured information. Although existing data warehouse architectures still have a part to play, organisations should be thinking of how to break out from the constraints of a relational-only data world.

Linking data at the source allows for greater audit and governance capabilities, as the data itself remains unchanged and the reports and decisions made will always refer back to the source data, rather than to snapshots that have been extracted, transformed and loaded into a different data warehouse environment. However, this will not always be possible, in particular where external data is concerned (for example, the results of a web search or the use of an information store that is not under the direct control of the organisation). In these cases, a snapshot of the information will need to be taken and stored for ongoing analysis; however, a link showing where the information came from should also be stored for audit purposes.

For many, this will require adding Hadoop into the mix, along with a NoSQL data store. Hadoop can be used to provide data volume reduction, utilising MapReduce so that the resulting data being fed into the analysis is kept at reasonable levels, minimising the need for investment in large additional data warehouses. To deal with the velocity aspect, full scalability within a high-speed environment will also be required. This can be provided through an in-memory analysis engine that loads the required data into fast memory for the analysis to take place. For the veracity aspect, data cleansing will be required. This may involve polling data from multiple sources or using external services to ensure that addresses, for example, are correct and up to date.
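The map/reduce pattern behind that volume reduction can be sketched in a few lines of plain Python: raw records are mapped to key/value pairs, then reduced to per-key aggregates, so only the aggregates need to flow into the warehouse or analysis layer. On a real cluster this would run as a distributed Hadoop job; the records below are invented:

```python
from collections import defaultdict

# Invented raw event records; on Hadoop these would be split across nodes.
events = [
    ("store-1", 20.0), ("store-2", 5.5), ("store-1", 12.5),
    ("store-2", 9.0), ("store-1", 3.0),
]

# Map phase: emit a key/value pair for each record.
mapped = ((store, amount) for store, amount in events)

# Reduce phase: aggregate values per key, shrinking many raw records
# down to one row per store before any further analysis takes place.
totals = defaultdict(float)
for store, amount in mapped:
    totals[store] += amount

print(dict(totals))   # {'store-1': 35.5, 'store-2': 14.5}
```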
For value, the user needs a means of adequately defining their requirements and of stating how valuable the findings will be, both to them and to the organisation. Through this, analysis workloads can be prioritised and run so as to maximise the overall value to the business. A possible mechanism is sketched below.
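One simple way such prioritisation could work is a value-ordered queue: each requested analysis carries the business value its requester assigned, and the engine always runs the highest-value job next. A minimal sketch, with all job names and values invented:

```python
import heapq

# Each entry is (negative business value, job name); heapq is a
# min-heap, so negating the value pops the highest-value job first.
queue = []
for value, job in [(8, "campaign what-if"), (3, "weekly sales report"),
                   (9, "supply-chain risk scan")]:
    heapq.heappush(queue, (-value, job))

while queue:
    neg_value, job = heapq.heappop(queue)
    print(f"running '{job}' (declared value {-neg_value})")
```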


Case Study

St. Antonius Hospital


St. Antonius Hospital is a modern Dutch clinical training hospital split across six facilities in Utrecht and Nieuwegein. The hospital deals with over 547,000 in-patients and 50,000 outpatients per year.

Challenges
In order to improve patient care and reduce its facilities' operating costs, St. Antonius Hospital needed better analysis of data for things such as emergency room waiting times and operating theatre occupation. With patient and research data trapped in separate silos for each hospital department, St. Antonius had a real challenge on its hands. In order to achieve a holistic view of both hospital activities and patients, St. Antonius needed a central data warehouse and business intelligence (BI) platform that would break down departmental silos and make data analysis available to the entire hospital staff. In addition, St. Antonius also needed to leverage the international Health Level Seven (HL7) standards for healthcare data exchange and sharing, both within hospital departments and for central government reporting.
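HL7 v2 messages are pipe-delimited text segments. As a rough, invented illustration of what consuming such a feed involves before loading it into a warehouse (a production system would use a proper HL7 library and the real message definitions rather than naive string splitting):

```python
# A hypothetical HL7 v2 message; segment and field content is invented.
SAMPLE_HL7 = "\r".join([
    "MSH|^~\\&|HIS|StAntonius|DWH|StAntonius|20140201||ADT^A01|123|P|2.4",
    "PID|1||987654^^^Hospital^MR||Doe^Jane||19701231|F",
])

def parse_segments(message: str) -> dict:
    """Split an HL7 v2 message into {segment_id: field list}."""
    segments = {}
    for raw in message.split("\r"):       # v2 segments end with \r
        fields = raw.split("|")           # fields are pipe-delimited
        segments[fields[0]] = fields
    return segments

msg = parse_segments(SAMPLE_HL7)
print(msg["PID"][5])   # PID-5, the patient name field: 'Doe^Jane'
```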

Solution
St. Antonius Hospital contracted with Tholis Consulting to help implement the hospital-wide BI project and train its staff in BI practices. A team sponsored and driven by St. Antonius' CTO was set up to ensure that benefits were achieved. The solution chosen uses Pentaho and has resulted in a series of systems:

• Data discovery and analysis: data is now provisioned and accessed by users throughout the hospital.
• Reporting: around 30 standard reports are available for direct use by a range of users. For example, a report of the waiting list for lung transplant patients can be easily accessed by doctors and administrators.
• Dashboards: St. Antonius' board of directors has access to a balanced scorecard for strategic planning and management, allowing it to align hospital activities to the vision and strategy of the organisation and to monitor organisational performance against key goals.
• Data mining: by working alongside St. Antonius' existing R statistical tool, researchers can carry out analysis on issues such as lung patient survival rates.
• Mobile BI: with doctors and administrators constantly on the go, the implementation of Pentaho will be extended to include mobile devices.
• Pentaho Data Integration: Pentaho allows St. Antonius staff to extract data from various internal and external sources and load them into their Data Vault. Although Pentaho created this for St. Antonius, the hospital has kindly decided to donate it to the open source community so that any healthcare organisation can leverage it to overcome its own data integration challenges.

Results
• St. Antonius has seen a 20% improvement in emergency room turnaround times.
• Data analysis is now in the hands of the doctors via self-service BI.
• The use of surgery rooms and personnel has been optimised through better visibility into such issues as the number of beds and operating theatres in use.
• Better research intelligence and preventative care through the use of data mining and predictive analysis.
• Easier and faster compliance with core central government reporting requirements.
• Lower costs and fewer resources: the Pentaho system is overseen by a team of three people providing BI capabilities across the whole hospital.


A view into the future


Once a suitable platform is in place, the business can start to reap the benefits. Rather than being constrained by inflexible reports that take ages to be modified and require IT input, users can be far more innovative in their approaches to analysing data. Rather than being stuck with a single view, they can look at data through different lenses: maybe by using a heat map instead of a bar graph, or a surface plot rather than a pie chart. They can also carry out "what if?" analysis, for example by taking an existing view of historical data and creating a new report showing what would likely have happened if a different action had been taken. A/B analysis can be carried out in near-real time to identify which version of a process or campaign is working most effectively, enabling small changes to be made on a more continuous basis to ensure that the business maintains its competitiveness in its markets.

Additional data sources can be pulled in to add to the analysis. With the internet now ubiquitous, there is a wealth of free and commercial information sources that organisations should look to include in their analysis. This ad-hoc blending of data sources will enable businesses to look outside the constraints of the data sources that they own. Data held by suppliers, customers and logistics partners can be added in and utilised; commercial data, such as that provided by LexisNexis, Dun & Bradstreet and Equifax, can be added into the mix for direct analysis. Web searches and other less formatted information sources can be leveraged as well, through the use of hybrid technology platforms mixing relational, schema-less NoSQL and Hadoop-based systems.

The flexibility of the platform should allow individuals to work as they need, not as they feel they are forced to by the technology. A view for a sales manager may not be adequate for a marketing manager, but they both need to be working against the same underlying data. The marketing manager may want to bring in, for example, the costs of carrying out a blended campaign across the web, email and TV, while the sales manager may want to see how the organisation's competitors are doing across different geographies. By layering extra data sources on top of the basic underlying data, new insights can be gained into the opportunities or issues the organisation will face. Where an external data source has an impact on the possible end decision, it must be captured and added to the underlying source data, so that when the results are forwarded on through the decision-making chain, the recipient can drill down as required to ensure that they fully understand how the current outcomes were decided upon.
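As an illustration of the near-real-time A/B comparison mentioned above, the sketch below applies a standard two-proportion z-test to invented conversion counts for two campaign variants. A real deployment would stream these counts in continuously and re-test as they grow:

```python
import math

# Invented running totals for two campaign variants.
visitors_a, conversions_a = 5000, 400   # variant A: 8.0% conversion
visitors_b, conversions_b = 5000, 460   # variant B: 9.2% conversion

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b
pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)

# Standard two-proportion z-test on the difference in conversion rates.
se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"z = {z:.2f}, two-sided p = {p_value:.3f}")
# A small p-value suggests variant B's higher rate is not just noise.
```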

Conclusions
Existing analytics approaches are struggling to meet an organisation's needs. However, a complete forklift replacement of existing systems is not required. By building upon existing data stores and embracing these as feeds into a more flexible environment, organisations can build a next-generation business intelligence platform while still utilising many of the skills built up in other areas.

What is needed is a system that can accept a range of data sources going beyond standard formal databases, including streamed, less structured data as well as big data sources. This blending of multiple different data sources and data types must be able to be carried out on the fly, so as to provide the "what if?" scenario capabilities that will define those organisations able to reap the financial rewards of a fully flexible BI system.

However, this will also require changes in the way that the front end of a BI system works. It has to be intuitive enough for general users, negating the need to go to IT every time a new report is required. It has to make access to different data types and sources quick and easy. It has to provide a range of ways of looking at the results, so that different users can use a visualisation that makes sense to them. It has to allow multiple different visualisations to be created against the same underlying data set, and it must allow reports to be portable in an active manner.



A report that is sent around to different recipients must allow them to carry out their own analysis of what they see, through capabilities to drill down into the underlying data. This portability should not just be across the organisation itself, but should also allow for a secure means of sharing active reports across the value chain of suppliers and customers.

All of this leads to a change in mindset: from relying on data specialists who spend their working lives collating and analysing data sets and sending results on to business people, to supporting and empowering general workers in their day-to-day activities. Future winners in the markets will be defined by their ability to make the most of all the data sources available to them, and by how many of their employees and partners in the value chain can do so as well.

Now is the time to create a business intelligence platform for the future: one that can cope with the changes in data sources and types, and can embrace new approaches over time.


About Pentaho
Pentaho is delivering the future of business analytics. Pentaho's open source heritage drives its continued innovation in a modern, integrated, embeddable platform built for the future of analytics, including diverse and big data requirements. Powerful business analytics are made easy with Pentaho's cost-effective suite for data access, visualisation, integration, analysis and mining. For a free evaluation, download Pentaho Business Analytics at www.pentaho.com/get-started.

REPORT NOTE: This report has been written independently by Quocirca Ltd to provide an overview of the issues facing organisations seeking to maximise the effectiveness of today's dynamic workforce. The report draws on Quocirca's extensive knowledge of the technology and business arenas, and provides advice on the approach that organisations should take to create a more effective and efficient environment for future growth.

About Quocirca
Quocirca is a primary research and analysis company specialising in the business impact of information technology and communications (ITC). With worldwide, native language reach, Quocirca provides in-depth insights into the views of buyers and influencers in large, mid-sized and small organisations. Its analyst team is made up of real-world practitioners with first-hand experience of ITC delivery who continuously research and track the industry and its real usage in the markets. Through researching perceptions, Quocirca uncovers the real hurdles to technology adoption: the personal and political aspects of an organisation's environment, and the pressures of the need for demonstrable business value in any implementation. This capability to uncover and report back on the end-user perceptions in the market enables Quocirca to provide advice on the realities of technology adoption, not the promises.

Quocirca research is always pragmatic, business orientated and conducted in the context of the bigger picture. ITC has the ability to transform businesses and the processes that drive them, but often fails to do so. Quocirca's mission is to help organisations improve their success rate in process enablement through better levels of understanding and the adoption of the correct technologies at the correct time.

Quocirca has a pro-active primary research programme, regularly surveying users, purchasers and resellers of ITC products and services on emerging, evolving and maturing technologies. Over time, Quocirca has built a picture of long term investment trends, providing invaluable information for the whole of the ITC community. Quocirca works with global and local providers of ITC products and services to help them deliver on the promise that ITC holds for business. Quocirca's clients include Oracle, IBM, CA, O2, T-Mobile, HP, Xerox, Ricoh and Symantec, along with other large and medium sized vendors, service providers and more specialist firms. Details of Quocirca's work and the services it offers can be found at http://www.quocirca.com

Disclaimer: This report has been written independently by Quocirca Ltd. During the preparation of this report, Quocirca may have used a number of sources for the information and views provided. Although Quocirca has attempted wherever possible to validate the information received from each vendor, Quocirca cannot be held responsible for any errors in information received in this manner. Although Quocirca has taken what steps it can to ensure that the information provided in this report is true and reflects real market conditions, Quocirca cannot take any responsibility for the ultimate reliability of the details presented. Therefore, Quocirca expressly disclaims all warranties and claims as to the validity of the data presented here, including any and all consequential losses incurred by any organisation or individual taking any action based on such data and advice. All brand and product names are recognised and acknowledged as trademarks or service marks of their respective holders.
