
Info-Tech Research Group 1

Vendor Landscape Plus: Data Integration Tools

Integration is the name of the game.
Many enterprises are unfamiliar with the benefits of data integration tools.
Use this research to get a handle on how you can best integrate diverse
source data, and which software vendors will help you manage your data.
This Research Is Designed For:
CIOs and IT managers of medium to large firms who have informal Data Integration (DI) practices.
Enterprises with 2+ major systems (e.g. CRM, ERP), a document repository, and increasing analytics needs.
All data integration architectures.

This Research Will Help You:
Understand the capabilities of data integration tools, and their potential use cases.
Differentiate between vendor offerings and align potential solutions with your organization's requirements.
Shortlist DI vendors, prepare an RFP, and score RFP responses.
Develop an implementation strategy and optimize your investment in data integration tools.
Info-Tech predicts that the tooling's capabilities will merge across all types
into single product family offerings within the next three to five years.
The integration space consists of data and process integration as well as middleware; this set focuses on data integration.
Data Integration
Reflects the convergence of Enterprise Application Integration (EAI), Data Integration (DI), and Extract, Transform, Load (ETL) vendors.
Encompasses data consolidation, federation, propagation, and access.
Is gaining momentum as enterprises move from hand-coding to tools.
Prevents silos of information and enables an enterprise-wide view of data.

Process Integration
Is workflow automation for processes, e.g. online ordering.
Emphasizes a top-down process design approach to a greater extent than the other two, starting from a process model.
Allows organizations to streamline processes and creates opportunities for coordinated improvement.

SOA and Middleware
Is used to connect enterprise apps without transforming the data.
Creates access to data in distributed architectures.
Requires advanced architecture to achieve success.

For information on Application Integration Middleware, refer to Info-Tech's Application Integration Middleware Vendor Landscape.
For information on Application Integration Strategy, refer to Info-Tech's Develop an Application Integration Strategy.
For information on Business Process Management, refer to Info-Tech's Develop a Business Process Management Strategy.
Executive Summary
Understand Data Integration Trends & Considerations
DI tools are intended to free up developer resources by speeding up development, testing, and deployment.
Tools offer data federation, replication, and synchronization, not just ETL.
DI as a Service is ideal for less mature organizations that are ready to give up hand-coding.
Big Data and analytics insights can provide valuable information to organizations. DI tools assist in managing the new and emerging types of data that are resulting from the data explosion.
Make the case to the CIO and the CFO and get budget approval for DI tools.

Develop a DI Tool Selection Strategy
Your data integration tool selection strategy is a function of your information architecture maturity, data architecture, and developer staff size.
DI tools aren't for everyone; review your required features and determine which solution is right for you.

Evaluate Data Integration Vendor Offerings
Not all vendor offerings are equal. Choose the right one to suit your needs.
Platform standardization doesn't lock you into that vendor for tools; looking elsewhere is often more cost-effective.
Choose your solution carefully, as the products evaluated offer a tremendous range of functionality and target different needs.
If gaining budget approval is difficult due to data quality issues, consider tooling with data profiling and cleansing capabilities.

Develop a Data Integration Implementation Strategy
Prioritize integration to include people, process, and technology, and align tool deployment with existing architecture.
Develop your timeline in preparation for integration.
Introduce training early and paint a compelling picture of process ease with tools to convince developers that change is worthwhile.
DI is no longer a nice-to-have but a need-to-have to enable companies to effectively manage multiple data sources
Info-Tech evaluated eight competitors in the data integration market, including the following notable performers:
Informatica provides a unified platform that is a one-stop shop for
data integration.
IBM offers a highly capable and comprehensive integration product that is backed by the flanking InfoSphere platform.
SAP's platform has evolved and matured into a comprehensive portfolio with a strong product focus.
Value Award:
SAP offers the most affordable data integration tool solution with a
high-performing product at the absolute lowest price.
Innovation Award:
Talend's open-source, subscription offering provides a quality product at a reasonable price. They offer a free, downloadable version for smaller companies, and a subscription-based product for larger enterprises.
1. Overcoming a hand-coding monopoly: Developer resistance and prevalent in-house hand-coding continue to be data integration's greatest competitors, even though there are a number of viable vendor tool solutions in the marketplace.
2. It's the era of Big Data and Business Intelligence (BI): The benefits of exploiting BI and Big Data insights are sweeping through companies of all sizes. Data integration software has recently come under scrutiny as a solution to the data mining challenge.
3. The move towards unified platforms:
Tools don't just integrate anymore. Many
vendors are now incorporating a mix of
technologies such as Extract, Transform, Load (ETL), data quality, replication, metadata, Master Data Management (MDM), and data federation into their DI tools
as they move their solutions towards a
single unified platform.
Info-Tech Insight
The Info-Tech Data Integration Tools Vendor Landscape
Champions receive high scores for most
evaluation criteria and offer excellent value.
They have a strong market presence and
are usually the trend setters for the industry.
Innovators have demonstrated innovative
product strengths that act as their
competitive advantage in appealing to niche
segments of the market.
Market Pillars are established players with
very strong vendor credentials, but with
more average product scores.
Emerging players are newer vendors who
are starting to gain a foothold in the
marketplace. They balance product and
vendor attributes, though score lower
relative to market Champions.
For an explanation of how the Info-Tech Vendor Landscape is created, please see Vendor Evaluation Methodology in the appendices.
Every vendor has its strengths & weaknesses;
pick the one that works best for you
[Harvey ball scoring grid: each vendor rated on Product criteria (Features, Usability, Overall) and Vendor criteria (Viability, Strategy, Channel, Overall).]
For an explanation of how the Info-Tech Harvey Balls are calculated please see Vendor Evaluation Methodology in the appendices.
Legend: Exemplary / Good / Adequate / Inadequate / Poor
What is a Value Score?
The Data Integration Tools Value Index
The Value Score indexes each vendor's product offering and business strength relative to their price point. It does not indicate vendor ranking.
Vendors that score high offer more bang-for-the-
buck (e.g. features, usability, stability, etc.) than
the average vendor, while the inverse is true for
those that score lower.
Price-conscious enterprises may wish to give the Value Score more consideration than those who are more focused on specific vendor/product attributes.
For an explanation of how the Info-Tech Value Index is calculated, please see Value Index Ranking Methodology in the appendices.
For an explanation of how normalized pricing is determined, please see Product Pricing Scenario & Methodology in the appendices.
Table Stakes represent the minimum standard; without these, a product doesn't even get reviewed
If Table Stakes are all you need from your Data Integration Tool, the only true differentiator for the
organization is price. Otherwise, dig deeper to find the best price to value for your needs.
The products assessed in this Vendor Landscape meet, at the very least, the requirements outlined as Table Stakes.
Many of the vendors go above and beyond the outlined Table Stakes; some even do so in multiple categories. This section aims to highlight the products' capabilities in excess of the criteria listed here.
The Table Stakes / What Does This Mean?
Feature: Description
Batch Integration: Schedule integration processes to run on a periodic basis to control the movement of data from a source to target.
ETL: Extract, Transform, and Load data from a source to a target. Basic functionality for date, string, and numeric data manipulation.
Exception Reporting and Notification: Ability to catch, report, and handle exceptions as they occur during integration processing.
Metadata Management: Ability to manage the metadata associated with data stores in the enterprise.
Monitoring: Software allowing users to see the current state of integration processes during execution, and intercede if necessary.
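To make the Table Stakes concrete, here is a minimal batch-integration sketch in Python. It is an illustration only, not any vendor's implementation; the CSV source, SQLite target, and all names are hypothetical. It shows the batch ETL movement of data plus the catch-report-continue exception behavior described above.

```python
import csv
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("batch_integration")

def transform(row):
    """Basic date/string/numeric manipulation, per the ETL Table Stake."""
    return (row["id"].strip(), row["name"].title(), float(row["amount"]))

def run_batch(source_csv, target_db):
    """Move data from a source file to a target table in one batch,
    catching and reporting exceptions as they occur."""
    conn = sqlite3.connect(target_db)
    conn.execute("CREATE TABLE IF NOT EXISTS target (id TEXT, name TEXT, amount REAL)")
    loaded, exceptions = 0, []
    with open(source_csv, newline="") as f:
        for row in csv.DictReader(f):
            try:
                conn.execute("INSERT INTO target VALUES (?, ?, ?)", transform(row))
                loaded += 1
            except (KeyError, ValueError) as exc:
                # Exception Reporting and Notification: record the bad row and continue
                exceptions.append((row, exc))
                log.warning("rejected row %s: %s", row, exc)
    conn.commit()
    conn.close()
    return loaded, exceptions
```

In practice, such a job would be triggered on a periodic basis by an external scheduler (e.g. cron), which is the Batch Integration capability the table describes.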
Advanced Features are the market differentiators that make or
break a product
Feature: What We Looked For
Real Time Integration: Ability to trigger integration processes in near real time as data changes.
Data Quality: Data profiling, cleansing, and reconciliation across multiple sources, and subsequent monitoring of new data creation.
Recovery of Integration after Failure: Ability to successfully roll back a transaction, regardless of size or complexity.
Performance Monitoring: Ability to see performance metrics of executing processes without degradation.
Middleware Connectivity: Compatibility with industry-leading data, application, and messaging middleware.
Data Federation: Ability to resolve semantic and context conflicts between numerous data sources.
Synchronization and Data Replication: Ability to accurately reflect data changes in data across multiple data stores.
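The last capability, accurately reflecting data changes across multiple data stores, can be illustrated with a deliberately simplified one-way synchronization sketch. This is an assumption-laden toy (version-stamped dictionaries stand in for data stores); real DI tools use mechanisms such as change-data-capture or log shipping.

```python
def sync(source, replica):
    """One-way synchronization sketch: reflect source changes in the replica.

    Both stores are dicts of key -> (value, version); a higher version
    in the source wins. Returns the number of changes applied.
    """
    changes = 0
    for key, (value, version) in source.items():
        if key not in replica or replica[key][1] < version:
            replica[key] = (value, version)
            changes += 1
    return changes
```

Running the sync twice applies changes only once, which is the property a replication feature must guarantee.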
Advanced Features
Info-Tech scored each vendor's features offering as a summation of their individual scores across the listed advanced features. Vendors were given one point for each feature the product inherently provided. Some categories were scored on a more granular scale, with vendors receiving half points.
Scoring Methodology
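The scoring arithmetic is simple enough to sketch. The feature names and point values below are illustrative only, not actual vendor results from this Vendor Landscape.

```python
def feature_score(feature_support):
    """Sum a vendor's advanced-feature points.

    feature_support maps feature name -> 1.0 (fully present),
    0.5 (partially present / pending), or 0.0 (unsatisfactory).
    """
    return sum(feature_support.values())

# Hypothetical vendor scorecard:
vendor = {
    "Real Time Integration": 1.0,
    "Data Quality": 0.5,               # partial support earns a half point
    "Recovery of Integration after Failure": 1.0,
    "Synchronization and Data Replication": 0.0,
}
```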
Legend: Feature fully present / Feature partially present or pending / Feature unsatisfactory
Each vendor offers a different feature set; concentrate on what
you need
[Feature comparison grid by vendor: Real Time Integration, Data Quality, Recovery of Integration after Failure, and the other advanced features.]
Info-Tech Recommends:
The Informatica platform is a good fit for enterprises looking for an efficient, reliable, and easy-to-manage data integration solution.
Informatica PowerCenter
Redwood City, CA
FY11 Revenue: $783.3M
Informatica is a focused data integration vendor that offers a
deep product portfolio
Unified data integration platform designed to access, integrate,
and manage any type of data on any processing platform.
Utilizes a set of data mapping, ETL, and information life-cycle
management tools that extend and scale to enterprise needs.
Newest release focuses on Big Data, MDM, data quality and
self-service tools. Features include support to pull in data from
social network feeds such as Twitter, Facebook and LinkedIn,
and a universal connector to the Hadoop file system.
Cloud offering provides the ability to move integration workflows from cloud to on-premise with no redevelopment required.
Platform historically focused on power users; Big Data and
complex integrations require the on-premise product rather
than the cloud offering.
The social media and Big Data connectors are sold separately
from the core platform.
3 Year TCO: Priced between $100K and $250K
Info-Tech Recommends:
InfoSphere Information Server offers great integration functionality and impressive extensibility into other data domains.
InfoSphere Information Server
Armonk, NY
FY10 Revenue: $99.9B
IBM InfoSphere Information Server marches into the spotlight
with a solid product offering
Comprehensive platform that supports key enterprise
initiatives, such as warehousing, Big Data, MDM, and
information governance, and provides the ability to cleanse,
transform, and deliver information for immediate action.
SaaS offering through Amazon EC2.
Metadata-driven platform provides information in real time, on demand, or in batch to the data warehouse for immediate action, regardless of location.
Advanced message handling supporting complex XML.
BI tools provide powerful data analysis for competitive insights
and advantage.
InfoSphere Information Server is a family of products that has been integrated into a unified platform. However, each product in the family may be sold separately, and may also require additional installation and configuration to enable its full functionality.
Difficult learning curve for larger, more complex integrations.
3 Year TCO: Priced between $250K and $500K
Info-Tech Recommends:
BusinessObjects Data Integrator is an accessible, functional solution geared towards the SMB market.
BusinessObjects Data Integrator
Walldorf, Germany
FY10 Revenue: $16.8B
SAP is an established player in the industry, and it offers a solid product platform
Agile and intuitive environment that combines all data
integration and quality requirements, including development,
metadata and ETL functionality, and is fully web-services
enabled to support an SOA architecture.
Intuitive, codeless, drag-and-drop IDE rapidly develops
integration projects with the option to include data quality.
Transfer large quantities of data using parallelism, caching,
and grid computing approaches.
Trending and advanced text data processing analysis features.
Powerful data quality capabilities and support for unstructured data.
The SAP data integration technology was acquired from an
independent software vendor. Therefore, its roots are not SAP
centric and can be effective in non-SAP environments. Look to
the vendor to demonstrate examples of the technology
working effectively in non-SAP environments.
3 Year TCO: Priced between $50K and $100K
Info-Tech Recommends:
Pervasive's product is a strong offering. While the suite is not as robust as other offerings, Galaxy and Data Integrator's add-on library are strong selling points for SMBs.
Data Integrator
Austin, TX
FY10 Revenue: $47.2M
Pervasive Data Integrator offers impressive price and
performance, especially for SMBs
Mature integration suite that provides transformation and flow
of almost any kind of data between sources throughout the
organization on a continuous, event-driven, or scheduled
basis in a full range of usage scenarios.
Offers intranet SaaS and a broad offering of connectors.
Highly configurable. Design and deploy on-premises or in the
cloud using reusable metadata stored in an open, XML-based
design repository.
IDE interfaces are feature-rich and easy to use and administer.
Galaxy exchange marketplace for previously custom-only integration products.
Platform is not as robust for the high-performance requirements that come with large, complex projects; however, it is ideal for smaller companies that have smaller-scale data integration needs.
3 Year TCO: Priced between $250K and $500K
Info-Tech Recommends:
Talend is an open-source, scalable product that can handle data integration requirements in
organizations of all sizes.
Integration Solutions
Los Altos, CA & Suresnes, France
Privately held company
Talend is ideal for enterprises that need to make the DI tool
business case with little or no budget
Subscription-based, real-time, open-source integration platform that uses ETL functionality for data integration, cleansing, migration and synchronization, MDM, and business intelligence.
Talend Cloud supports all forms of cloud-based computing.
A real-time data integration platform that supports multi-user development and employs a unified data system for seamless collaboration.
features, error recovery management, and a full set of
connectivity adapters.
Doesn't have the bells and whistles of the larger offerings, but is a viable option for smaller companies with repeatable DI tasks that are constrained by budget.
Graphical User Interface is written in "developer-speak," so there is a slight learning curve for less experienced users.
No hosted offering, but a full set of connectors is included.
3 Year TCO: Priced between $100K and $250K
Info-Tech Recommends:
Oracle Data Integrator offers a ton of functionality for Oracle-aligned organizations. The platform is
best suited to those with complex data integration requirements.
Data Integrator
Redwood Shores, CA
FY10 Revenue: $2.99B
Oracle Data Integrator is a comprehensive solution, especially
when used to populate large data warehouses
Market Pillar
Unified solution that encompasses all data integration
requirements: from high-volume, high-performance batch
loads, to event-driven, real-time integration processes, to
SOA-enabled data services.
Advanced debugging, diagnostics, and error reporting.
Open, integrated ETL architecture delivers high-performance
data movement and transformation across complex systems.
Seamlessly integrates within an SOA infrastructure.
Enhanced connectivity to all third-party databases, data
warehouses and applications.
IDE contains mapping wizards and integrates with JDeveloper.
A comprehensive enterprise-level solution, but not designed
for smaller organizations that do not have high-level
integration needs.
Not as viable for non-Oracle shops; the platform works best
when used with other Oracle-supplied components.
Vendor Declined to Provide Pricing
Info-Tech Recommends:
For SQL Server owners, SSIS is free, making it the most viable option, but be aware that life may not
be simpler for your hand-coders.
SQL Server Integration
Services (SSIS)
Redmond, WA
FY10 Revenue: $875.4M
Microsoft SSIS is an obvious choice for SQL Server users, but
it may not reduce hand-coding in the long run
Market Pillar
Offers a fully scalable enterprise-class data integration
platform that includes a high-performance ETL tool that
supports integration and workflow apps, and can extract and
transform data from a wide variety of sources.
Very strong BI functionality, particularly for populating Data
Warehouses and Data Marts.
High performance, customizable integration functionality that
scales according to need.
Includes a rich set of built-in tasks and transformations; works
as a stand-alone or in combination with other packages to
address complex integration requests.
There can be compatibility issues in non-Windows environments.
Configuration, deployment, and maintenance can be difficult
and time consuming; steep learning curve.
Robust IDE features drag-and-drop functionality, but the need
for hand-coding has not been eliminated.
Vendor Declined to Provide Pricing
Info-Tech Recommends:
An excellent choice for organizations that are seeking broad data management functionality.
However, if data integration is your only focus, you may want to look elsewhere.
SAS Enterprise Data
Integration Server, DataFlux
Data Management Platform
Cary, NC;
Privately held company
SAS/DataFlux takes on the industry giants with its unified
data management suite
Emerging Player
Single, integrated suite that manages everything from up-front
data discovery, MDM, business process integration, data
governance and federation, and event processing. The
platform encompasses batch, real-time, and virtual integration.
Unique platform that combines DataFlux and SAS technologies; the framework bridges the gap between data quality, profiling, monitoring, and ETL tools from a centralized environment.
Data Management Studio provides extensive functionality for
developing rules and profiles that are reusable in other
projects; interface appeals to IT and business users.
Processing large volumes of data can be slow.
MDM component is fairly new, so that piece of the platform
may not be fully integrated.
SAS offering provides strong capability for data management
relative to BI, but the DataFlux offering lacks the BI focus.
3 Year TCO: Priced at $1M+*
*Vendor only provided list pricing; no discounts were incorporated into the pricing scenario.
Usability, debugging, robustness of tools, and developer appeal are key
factors to consider in the decision making process.
Rich Development Environments are key to the effectiveness
of your data integration tool
[Vendor rankings: Exemplary, Viable, and Adequate Performers.]
Are there a number of complementary products that can be used as stand-alone products and scale to integrate within a unified solution when needed?
Consider vendor comprehensiveness when evaluating data
integration tools
[Vendor rankings: Exemplary, Viable, and Adequate Performers.]
Not everyone works in a stack environment. Avoid vendor lock-in by
selecting a solution that is platform, ERP, and middleware independent.
Ensure the vendor's offering can be platform independent if required.
[Vendor rankings: Exemplary, Viable, and Adequate Performers.]
Data is an investment and every company's greatest asset; use DI tooling to leverage that asset and get valuable returns
Don't just evaluate the financial ROI, but also the Return on Data. Info-Tech
research has concluded that DI tooling has measurable benefits.
% of survey respondents that reported positive impacts of using DI tools:
Using DI tools increases: developer effectiveness, development estimates, documentation quality, report accuracy, and data quality.
Using DI tools decreases: data integration errors and hand-coding.
N=117. Source: Info-Tech Research Group
Data and report accuracy are factors that can be improved early in the lifecycle of the tool, whereas code reuse and development time will improve over time.
DI is not an IT-only initiative; it requires collaboration with business to drive value and ensure success.
What's in this Section:
Understand DI Trends and Considerations

Sections:
Understand Data Integration Trends and Considerations
Develop a DI Tool Selection Strategy
Evaluate DI Vendors
Develop Your DI Implementation Strategy
Hand-coding and point-to-point integrations stifle growth.
DI tools are ready for primetime and address many data
management issues.
Investigate Data Integration as a Service (DIaaS) if you
have lighter integration needs.
DI tools help with Big Data, BI and development resources.
Make the case for budget approval.
Learn from your peers through case studies.
Point-to-point integration stifles growth; stop hand-coding
your own demise
Point-to-point integration architectures cannot keep up with business growth
and represent weak links in mission critical integration scenarios.
One wrong change in multiple integrations can bring down the entire system.
Tools can ensure your changes don't lead to data mismatches.
Rework, quality
problems, manual keying,
and endless rewriting are
needless costs.
Implement tools to increase predictability and improve data management.
Hand-coding is unreliable and hard to manage across data architectures.
Tools eliminate the need to rekey, simplify rework, and offer quality and consistency.
Simple coding requests frequently spiral into tedious, never-ending projects.
Automation in tools frees resources from cyclical tasks.
"For an efficient IT shop, the tool is much more versatile and easier to maintain."
- Manager of Enterprise Services, Entertainment
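The scaling problem can be reproduced with a quick calculation, under the common assumption that point-to-point architectures need one interface per application pair while a tool acts as a hub with one connection per application:

```python
def p2p_interfaces(n):
    """Point-to-point: every pair of applications needs its own interface."""
    return n * (n - 1) // 2

def tooling_interfaces(n):
    """Hub-style tooling: each application connects once to the tool."""
    return n

# With 20 applications: 190 point-to-point interfaces vs. 20 hub connections.
```

The quadratic growth of the point-to-point count is why those architectures cannot keep up with business growth.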
[Chart: number of interfaces required as the number of applications (n) grows from 2 to 20, comparing P2P interfaces with tooling interfaces.]
If you haven't looked at tools lately, you should look again
Tools offer integration functionality beyond traditional ETL*: data federation,
replication, synchronization, and design are increasingly appealing to
business users.
Data Integration Tools integrate and consolidate all data types from internal and external sources into target destinations such as databases, data marts, data warehouses, and files.
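Beyond consolidation, the federation capability mentioned above can be illustrated with a minimal sketch: answering a query across several systems without first moving their data into one store. The CRM/ERP sources here are hypothetical stand-ins, not any real product's API.

```python
def federated_query(sources, predicate):
    """Data federation sketch: query several systems in place.

    `sources` is a list of callables, each returning rows (dicts) from
    one system; `predicate` selects the rows the caller wants."""
    for fetch in sources:
        for row in fetch():
            if predicate(row):
                yield row

# Hypothetical stand-ins for a CRM and an ERP system:
crm = lambda: [{"customer": "Acme", "system": "CRM"}]
erp = lambda: [{"customer": "Acme", "system": "ERP"},
               {"customer": "Beta", "system": "ERP"}]
```

For example, `list(federated_query([crm, erp], lambda r: r["customer"] == "Acme"))` returns one Acme row from each system without copying either data set.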
Contrary to popular belief, these tools are no longer in their infancy. A lot of tool functionality is now embedded, and intuitive user interfaces have made them very easy to use.
Having a handful of tools to achieve your integration needs is no longer necessary. Tools are now consolidated and platform-based, so you can pick the solution that's right for you.
Data is everywhere. Data is found across the enterprise in
disparate systems and formats (e.g. transactional systems,
legacy systems, spreadsheets, and flat files).
Big Data and BI insights are useful, but the challenge is
manipulating the volume of data for quality and usability.
Proliferation of isolated solutions, as different teams
create custom code for each integration project. This
degrades data quality, creates redundancy, impairs
reusability, and increases TCO due to the time and effort
needed to support multiple tools.
Mergers & acquisitions require tools to combine data from disparate systems, eliminating the need to create custom code to extract data from each system.
Legacy systems are outdated and difficult to integrate.
Data integration tools have built-in connectivity capabilities
enabling extraction from many source technologies.
The Problems The Solution
*Extract, Transform, Load
You're probably integrating anyway, so why not use tools to do
it cheaper, faster, and better
Developers who embrace tools improve their development process, which leads to higher-quality results, code reuse, and reliable estimates.
Creates consistent approach, simplifying
development and debugging.
No need to re-code to adapt the function of integrations.
Manual re-keying is not necessary.
Accommodates changes in the architecture.
Knowledge is out of developers' heads, not lost with turnover.
On-boards new developers faster.
Creates metadata to enable BI.
Easier reporting.
Decreased data quality issues.
Shorter timelines and reduced complexity of
business projects.
Better access to data.
Benefits of Tools
Maintaining multiple skill sets and technologies.
Redevelopment, rework, integration updates, and
project times.
Debugging and exception handling development time.
Resource ramp-up time.
Data mapping and modeling time.
Costs Reduced
N=131 Source: Info-Tech Research Group
Two-thirds of survey respondents
are already using DI tools
DI tooling improves IT delivery success rates on integration projects
Pitfalls of Poor Data Integration
Costs: hardware costs, software costs, operational costs, opportunity costs.
Customer Experience: customer churn, reactive decisions, poor marketing, slow reaction times.
Marketing: poor decision making, poor positioning, poor investment strategy.
Operations: slow service delivery, slow time-to-market, rigid and labor-intensive processes.
"For hundreds of thousands of lines, you can't do it by hand. You're going to miss something. If you use tools it's doing it all for you, and going to save you hours of work."
- Data Warehouse Manager,
Distribution Services
Avoid the headaches. A recent Info-Tech survey shows that using DI
tools provides significant success rates in the following areas:
[Chart: % of increased success due to tool usage across Development Process Consistency, Code Reusability, Validity of Development Estimates, Quality of Documentation, and Minimized Hand-Coding. N=117. Source: Info-Tech Research Group]
"Data integration tools may not be as versatile as coding by hand, but they provide a standard that can be easily picked up by others and remove the inconsistencies that are often introduced by hand-coding."
- Info-Tech Survey Respondent
DI as a Service (DIaaS) offerings are becoming more robust and offer a quick
ROI to organizations that are looking to give up hand-coding
If you haven't made an investment in on-premise DI,
investigate DI as a Service as an alternative
Organizations that adopt SaaS solutions without a careful data management and integration strategy will see degraded data quality. DI as a Service offers a fast path to a solution.
Adopting SaaS introduces the need for integration,
and while custom coding is common and generally
feasible, it can be very resource intensive and difficult
to scale as SaaS applications multiply and more point-to-point connections are demanded.
DIaaS, an on-demand offering itself, addresses SaaS
integration challenges. It is cloud-based and designed
to work with SaaS offerings, significantly reducing
implementation time and overall costs.
Many DIaaS vendors utilize pricing models designed
to scale from small to large organizations, e.g.
providing free development tools and charging
customers on a monthly, per-connection basis, rather than licensing a major software package.
Consider DIaaS if you:
Are adopting SaaS apps
Have informal MDM processes
Are intimidated by DI tools
Operate in a SaaS-friendly industry
A number of industry experts use the acronym IaaS in reference to Integration as a Service. To reduce confusion between the Integration as a Service and Infrastructure as a Service short forms, Info-Tech refers to Data Integration as a Service as DIaaS.
Big Data is big news; determine how large of an impact it will
have on your organization
Big Data, and how to manage it, is being hailed, or hyped, depending on your
opinion, as THE key IT strategy of the future.
While the size of Big Data is debatable, one thing is for certain: Big Data is only important to an organization
if it can be utilized to provide results and offer legitimate insights for future direction. Big Data only becomes
relevant when an organization understands what to do with it.
Big Data spans the three following areas:
Variety: Big Data not only includes structured data, but all types of unstructured data: text, email, audio, video, click streams, log files, and more.
Velocity: Big Data tends to be time sensitive, so in order to maximize its value, it must be captured as it is streaming into the enterprise.
Volume: Big Data only has one size: large. Enterprises are flooded with data, easily accumulating gigabytes, terabytes, and sometimes petabytes of data.
Data volumes continue to expand exponentially, and the size and scope of this data make it difficult to manage with today's technologies.
Substantial smartphone and social network usage,
along with increasing use of video and media files
are creating enormous amounts of structured
and unstructured data.
Many organizations are finding that they need to
keep more data longer to meet legal and
regulatory compliance, which just adds to the ever-growing pile.
Enterprises are now facing significant technology challenges in managing this data.
Big Data isn't going away. It will continue to impact
the IT world and organizations should begin to
consider the implications of Big Data.
Decide if Big Data will influence your Data Integration needs
In order to take advantage of the insights hidden within Big Data, IT will need to create a management architecture that enables seamless data flow between systems. Data integration, middleware, and business
process flow technologies will become critical platforms to support this evolution.
While Hadoop, Pig, and NoSQL are garnering a lot
of attention, the management of Big Data is still
evolving. However, a number of DI vendors have begun to incorporate Big Data solutions within their offerings.
Blogs, social media, smartphone usage, and
enterprise applications are producing a mountain of
data that, when properly handled and analyzed,
can assist organizations in uncovering concealed opportunities that weren't recognized in the past.
These Big Data insights have to be guided by real-
time predictive intelligence in order to be useful.
However, most IT departments do not currently
employ the right talent to support a Big Data
strategy. If your organization has chosen to go
down the Big Data path, be prepared to hire or
retrain staff to support the skill set that Big Data requires.
When considering DI technology for Big Data, consider these factors:
Do you have a Big Data problem, or merely a data problem? If you've simply run out of storage, processing power, backup window, or network capacity, it may be a classic data problem. DI tools can help to optimize your existing data management.
Have you optimized the data into tiers for performance? As you optimize storage performance vs. cost, your DI tools can help by getting data in and out more quickly.
Big Data has one critical attribute: it will get much bigger very quickly. The right DI tool will support rapidly evolving needs that may be unforeseen during product selection.
Enterprise Business Intelligence (BI) depends on Data
Integration for quality data
BI success is directly related to the quality of the underlying data. Data quality means the absence of data conflicts, data duplication, outdated data, incomplete data, and inaccurate data.
[Chart: Overall Data Quality. N=41. Source: Info-Tech Research Group]
Organizations are now mining huge data sets and using BI-driven insights; however, BI solutions are often still constrained by data quality issues.
BI is only as useful as the underlying data. With
data coming from numerous sources, data
cleansing has become a primary goal of the data
integration process. IT's role is shifting from
developer of the reports to provider of usable and
useful data. IT departments need to ensure
continual data quality as BI increases its role in
decision making.
Data integration is a core component of any
enterprise BI solution. While a single data source
may contain high quality data, that quality will
degrade when multiple data sets are simply
aggregated. Data Integration tools are key to the
ongoing aggregation of data sources used by
Business Intelligence technologies.
In today's competitive business environment, BI and Analytics are expected to drive business results in
areas where the bottom line can easily be measured.
Getting budget approval is tough, and it can be even tougher if
the value of the tooling is not fully understood
Successful data integration initiatives are a source of competitive advantage, allowing enterprises to reduce costs, derive insights, and outperform competitors.
Opportunities are missed for tool adoption
because of failure to make the business case and
gain budget approval.
Realizing the business and IT benefits of tools
requires an ability to justify the costs of a data
integration tool to the CFO or budget authority.
Many vendors offer a family of products in which
an organization can start small and grow. This
may result in a lower up front investment and a
better chance to gain budget approval.
If budget approval is a real challenge, consider DIaaS versions of the vendors' product offerings, which may offer a lower-cost solution and, subsequently, a better chance of budget approval.
Organizations with poor data quality have difficulty getting budget approval. Integrating bad data results in more
bad data, and the cost of cleaning up data before attempting to integrate it is prohibitive. DI tools can help
organizations improve their data quality and therefore should be a worthwhile investment where this is an issue.
[Chart: Budget Approval is the Highest Friction Point to DI Adoption; other friction points include "tools aren't robust enough." N=84. Source: Info-Tech Research Group]
DI tools pay for themselves by freeing up development resources
Look beyond resources; tools offer hard cost savings
from decreased operational development and
maintenance costs.
The cost of licensing a tool could be anywhere from zero* to $500,000, depending on the product and your environment.
For the mid-market, several recommended options are available in the $30,000-$50,000 range.
This does not include:
Implementation costs
Support - 10-15% of licensing cost
Developer training - $2000/developer
Purchase of extra application licenses
Any new hardware required
"Our cost is in the six figures annually just for the integration point. It's excessive, there's no question about it."
- CIO, Construction
Using a tool does not mean firing developers.
Developers can often be used for more directly
beneficial business functions when Data Integration
costs go down.
Data Integration tools can reduce the skill set
requirements for developers, making it possible to
reduce staffing costs over time.
Hard costs won't always justify tool purchases, but soft savings from a faster, easier process often validate the decision.
Often at six-figure salaries, developers can cost you more than many tools on the market today.
*Talend offers a free open-source data integration tool, though Info-Tech does not recommend that version
for enterprise use.
Case Study: Thirty days is great, but how about twenty minutes?
Financial Services: a mid-sized insurance firm selecting a data integration tool to decrease business order time.
Enterprise applications are integrated through hand-coding, resulting in high development times and effort.
Need an FTE to run every
integration batch manually.
Brought in consultants to
document the business case for
integration tools.
Presented the case to management and developers, gaining full buy-in.
Business order times will be
reduced to 20 minutes from 30
days because tooling allowed
for a more flexible architecture
and near real-time integration.
The client website will be dramatically more usable.
Eliminated one FTE, saving
over $200,000 annually.
"One of the findings was if we wanted to just keep our company where it is today, then we don't need to do anything. If you want to grow, then we have to do something."
- IT Director, Insurance
Case Study: DI tool as the enabler of a virtual Master Data Management strategy
Master Data Management is beyond the reach of many companies because of cost, but DI can enable a simpler, virtual approach.
Office Furnishings
Situation: Unreliable Data
A mid-sized manufacturer of office furnishings had unreliable analytics after their CRM system was migrated to a cloud-based solution. To encourage adoption of the new system, UI edit rules were relaxed and restrictions on account record creation were removed. The relationships between customer, financial, and product data became unstable.
Action: Virtual MDM
The IT department decided on a virtual Master Data Management strategy based on their data integration tool. The rules-driven solution uses a registry model for MDM: the DI tool dynamically queries multiple data sources to create a real-time, high-quality data view. A small amount of human intervention is required daily to resolve data conflicts.
Results: Improved BI
After a relatively small development effort, the IT department produced a reliable analytics data source for marketing and product development users. People throughout the organization did not have to change their silo mentality: the CRM and Supply Chain Management systems stayed as-is, optimized for their users rather than for data quality.
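The registry-model pattern in this case study can be sketched in a few lines of Python. The source systems, field names, and conflict handling below are hypothetical illustrations, not the firm's actual implementation: a registry stores only keys and pointers, queries the real systems at read time, and flags disagreements for the daily human-review queue.

```python
# Hypothetical source systems; a registry-model MDM stores no master copy,
# only keys and pointers to the systems that hold each record.
CRM = {"C100": {"name": "Acme Corp", "region": "EMEA"}}
ERP = {"C100": {"name": "ACME Corporation", "credit_limit": 50000}}

REGISTRY = {"C100": [("crm", CRM), ("erp", ERP)]}

def unified_view(customer_id):
    """Merge attributes from each registered source at query time.

    The first-seen value is kept in the merged view; later disagreements
    are collected for human review rather than silently resolved.
    """
    merged, conflicts = {}, {}
    for system, store in REGISTRY.get(customer_id, []):
        for field, value in store.get(customer_id, {}).items():
            if field in merged and merged[field] != value:
                conflicts.setdefault(field, {})[system] = value
            else:
                merged[field] = value
    return merged, conflicts
```

A call such as `unified_view("C100")` returns one merged record plus a conflict entry for the differing `name` values, mirroring the "small amount of human intervention" the case study describes.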
Employ DI Tools to measurably improve efficiency and
operational success
Success divides into two main statistical factors, but you can skip the heavy
math. Efficiency and operational factors are significantly improved when
integration tools are adopted.
Organizations using tools rather than
hand-coding for data integration are
11% more successful when it comes
to operational success
Compared to organizations that hand-
code, organizations that use tools are
10% more successful when it comes
to efficiency of data integration
[Chart: Impact of Data Integration Tools. Efficiency success factors include development time, process efficiency, process satisfaction, solution satisfaction, code reusability, and cost effectiveness; operational success factors include MDM structure, process predictability, and data accuracy. N=63. Source: Info-Tech Research Group]
What's in this Section: Develop a DI Tool Selection Strategy

Sections:
Understand Data Integration Trends and Considerations
Develop a DI Tool Selection Strategy
Evaluate DI Vendors
Develop Your DI Implementation Strategy
Decide if DI tools are right for you.
Determine which features you need.
Decide on your licensing model.
Select the right vendor for appropriate tool functionality.
You've heard the pitch; now decide if tools are right for you
Data integration tools aren't for everyone, and within the space there are many options for many issues. Determine which, if any, are right for you.
Consideration | Build | Buy
Initial Investment | Lower | Higher
Operations Cost | Higher | Lower
Support & Maintenance | In-house IT | Vendor-managed
In-house IT Staff Skill | High skill required | Less required
Data Cleansing | Limited | Often included
Metadata Capture & Impact Analysis | Limited | Often included
Data Source Changes | IT effort | Vendor-managed; supports user code
One-Time Integrations | Ideal | Overkill
Based on your answers to a series of questions about
your environment, the Info-Tech Data Integration Tool
Appropriateness Assessment will recommend an
appropriate strategy for your organization.
Determine which data integration features you need
Use | To | When
Data Migration/Conversion Tool | Migrate/convert data in bulk to a new system. | A source system is renewed or applications are consolidated.
Batch Integration Tool | Process data changes in bulk at scheduled intervals. | Loading data marts/warehouses and applications.
Real-Time Integration Tool | Update data in multiple systems as it is changed. | Requirements necessitate that data be updated in real time (operational dashboards, etc.).
Metadata Management Tool | Track data lineage (source, history). | Transforming/consolidating data from multiple sources into a data store (warehouse or mart).
Data Quality Tool | Cleanse and consolidate conflicting data. | Cleansing/consolidating within the same application or as part of a transformation routine.
Data Modeler | Create well-formed data structures that respect integrity. | Building/revising a data architecture for an application or enterprise structure.

Selection Tips:
Most data integration tool suites encompass all listed features to varying degrees.
Use your feature requirements as a guide as you develop your data integration tool strategy.
Purchasing a full suite may be more costly in the short term, but the level of support and functionality is higher than with cheap and easy tools.
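To make the batch integration feature concrete, here is a minimal sketch of one scheduled extract-transform-load cycle. The table names and the cents-to-dollars transform are illustrative only; a real batch tool would add scheduling, change capture, and error handling.

```python
import sqlite3

# Hypothetical source and target stores; a batch integration tool runs
# this extract-transform-load cycle at scheduled intervals.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount_cents INTEGER)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 1250), (2, 990)])

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE fact_orders (id INTEGER, amount_dollars REAL)")

def run_batch():
    """One batch cycle: extract all rows, transform units, load the mart."""
    rows = src.execute("SELECT id, amount_cents FROM orders").fetchall()   # extract
    transformed = [(i, cents / 100.0) for i, cents in rows]                # transform
    tgt.executemany("INSERT INTO fact_orders VALUES (?, ?)", transformed)  # load
    tgt.commit()
    return len(transformed)
```

Calling `run_batch()` moves both orders into the target table with amounts converted to dollars.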
Four factors govern your data integration tool selection
Developing a strategy is an art, not a science; use these factors to direct your
selection, but don't expect any vendor to deliver a silver bullet.
Answer | So you can
Data Flow Understanding: How well do you understand your enterprise's data flow? | Determine which tools will be comfortable for your current level or take you to the next one.
Data Architecture: Are you using point-to-point integrations, or have you moved to a hub? | Determine which tools help you develop your architecture, or just make your life easier.
Features: The data integration tool umbrella encompasses many features. Which do you need? | Determine which tools offer you the "need to have," not just the "nice to have," features.
Number of Developers: How many developers will be working with the data integration tool? | Determine which licensing models work best for your organization.
When you are ready to take a more holistic approach to integration, use tools
DI tool adoption should be a function of information architecture maturity.
If you have any of the following scenarios, tools are for you.
Information Architecture Maturity (Informal → Lite → Enterprise):
No MDM processes or tools are in place; data integrations are fully hand-coded. → Start developing an understanding of your data integration needs before jumping into tools.
Data flow is understood and documented, but no formal system exists. → Pick an architecture based on your processing load bottleneck and select a tool to support it.
MDM is part of a BI, DI, or EII suite; data is captured when moving into a BI report or warehouse. → Use tools to achieve application integration before data hits the central hub.
Enterprise data modeling is in place: every piece of data is accounted for in a data model. → You are likely already using advanced DI tools. Good job!
Data Integration is necessary for providing consolidated and accurate data for decision making. Without
integration tools, MDM and BI initiatives will fail.
Use your developer team size to pick the licensing model that's right for you
A small developer team may be cheaper to license per seat than under a server/processor-based licensing model.
1. Pick the Right License 2. Minimize Tool Count
Per Developer Licensing:
Per Processor Licensing:
Tools designed for smaller shops typically use per
developer licensing. All else equal, these are
typically most cost-effective for shops with four or fewer integration developers.
Stack tools, typically targeting enterprise clients, license
per processor. Measures of enterprise size aside, these
typically become cost-effective if they will be used by
more than four integration developers.
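The four-developer break-even can be illustrated with a quick cost comparison. The prices below are hypothetical placeholders, not vendor quotes; plug in real figures from your RFP responses.

```python
def license_cost(developers, per_seat=15000, per_processor=80000, processors=1):
    """Return total licensing cost under each model.

    The per-seat and per-processor prices are illustrative assumptions;
    substitute quoted prices when comparing real vendors.
    """
    return {
        "per_developer": developers * per_seat,
        "per_processor": processors * per_processor,
    }

def cheaper_model(developers, **kwargs):
    """Name the cheaper licensing model for a given team size."""
    costs = license_cost(developers, **kwargs)
    return min(costs, key=costs.get)
```

With these placeholder prices, four developers favor per-seat licensing ($60,000 vs. $80,000) while six tip the balance to per-processor, matching the rule of thumb above.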
Having a handful of tools to achieve your integration needs is no longer necessary. Tools are now consolidated and platform-based, so you can pick the stack that's right for you.
Standardize and consolidate tools to drive down costs. Paying for software licenses and support for multiple tools is much less cost-effective than purchasing one robust tool or suite.
Consider what tools offer in terms of price-to-performance. It is better to invest in one flexible tool than to purchase several as needed, which inflates software costs.
If up-front costs are too high to make tool use feasible, don't give up! Check out DI as a Service.
With market consolidation, a single tool may
now be used for data integration, migration, data
quality, and even MDM.
Not all vendors are created equal; pick the right one for your needs
While tool satisfaction is uniformly above hand-
coding, it ranges dramatically across stack vendors.
Reduction in hand-coding is also variable, with some
tools failing to provide a significant reduction at all.
I want | Info-Tech Recommends
To reduce hand-coding without abandoning point-to-point | Informatica Cloud
To get tools as cheaply as possible | Talend, Informatica Cloud, Pervasive Data Integrator
To stick with a stack vendor | SAP BusinessObjects Data Integrator, Oracle Data Integrator, IBM InfoSphere, Microsoft SSIS
Integration with a comprehensive data management platform | DataFlux, IBM InfoSphere
To try before I buy to help make the business case | Talend
To improve my SQL Server data integration | Microsoft SSIS
Effectiveness is highly vendor dependent
A recent Info-Tech survey indicates that almost 50% of respondents cite selecting a standard vendor platform as one of the primary inhibitors of DI tool success.
[Chart: Primary Inhibitors of DI Tool Success, including gaining budget approval and "tools aren't robust enough." N=84. Source: Info-Tech Research Group]
What's in this Section: Evaluate Data Integration Tools Vendors
Info-Tech's Vendor Landscape for eight DI Tool vendors.
Shortlisting DI Tool vendors through scenario analysis.
Developing and executing a DI Tool RFP.
Market Overview
Data integration continues to be an evolving practice even
though it came into existence more than 30 years ago, when
computing became distributed. The objective of data
integration is to collect data from numerous, different sources,
merge it, and display it to the user in such a way that it
appears to be a unified whole.
The need for data integration arose from mergers,
acquisitions, and organizations maintaining numerous
databases and applications that housed different data
aspects necessary to business operations. For many years,
organizations tried to consolidate their data into one system
by relying on developers to hand-code scripts to connect the
various sources, but that practice often resulted in quality issues, lengthy development times, and unreliability: one error in a line of code can bring an entire system down.
Data integration isn't just a component of an IT project
anymore. It has become a core practice that requires
consideration from the start if an organization is going to
make the most use of their data.
Data integration tools have become a hot commodity in
recent years. With BI and Big Data analysis moving to the
forefront in organizational strategy, DI tools are becoming
geared towards business stakeholders as well as IT
developers. Rich development environments, reporting
analytics, and efficient data management are becoming
key factors in the design of DI tools.
The integration area currently consists of middleware, data
and process integration. As DI solutions move towards
becoming a comprehensive, unified platform, the lines distinguishing these areas will become increasingly blurred.
the growing needs of the organization, they will require the
capacity to accommodate a broader range of data types.
Increasing use of SaaS will require DI to reside in the
cloud. Organizations wishing to capture social networking
data will require their DI solutions to integrate with various
media feeds such as Twitter and Facebook.
As the market evolves, capabilities that were once cutting-edge become default, and new functionality
becomes differentiating. Batch integration has become a Table Stakes capability and should no longer be
used to differentiate solutions. Instead, focus on data quality and real-time integration to get the best fit for
your requirements.
DI tools Vendor Landscape selection/knock-out criteria:
market share, mind share, and platform unification
SAS/DataFlux: The platform is a unified design, development and execution DI solution that enables data
quality, integration and master data management (MDM) from a single interface.
IBM: InfoSphere Information Server is a comprehensive platform that provides seamless data integration to
support initiatives across MDM, data warehousing, Big Data and migration projects.
Informatica: An inclusive, unified, open software platform to access, integrate, and manage any type of data on
any processing platform.
Microsoft: SQL Server Integration Services (SSIS) platform features a flexible data warehousing tool that aids
in data integration and workflow applications.
Oracle: Data Integrator is a broad integration platform that addresses all data integration requirements,
including seamless batch and real-time integration and data warehousing.
Pervasive: Data Integrator is a highly configurable integration platform that performs extraction,
transformation and flow of nearly any kind of data between sources throughout the organization.
SAP: BusinessObjects allows the organization to profile, extract, transform, and move data in real time and at
any interval anywhere across the enterprise.
Talend: Provides open source, subscription based data integration solutions designed for business intelligence
and data migration and synchronization.
Included in the Vendor Landscape:
Until recently, the data integration market was a fairly conservative space with most solutions only offering ETL
functionality. However, changes to vendor offerings, with the inclusion of data quality (profiling, analysis and
cleansing), synchronization, and MDM are bringing about a movement towards unified platforms in DI technologies.
For this Vendor Landscape, Info-Tech focused on those vendors that have a strong market presence and/or reputational
presence among small to mid-sized enterprises.
Data Integration Tools Criteria & Weighting Factors
[Chart: criteria weightings, including Affordability (20%) and Architecture (20%)]
Vendor Evaluation
Vendor is committed to the space and has a
future product and portfolio roadmap.
Vendor offers global coverage and is able to
sell and provide post-sales support.
Vendor is profitable, knowledgeable, and will
be around for the long-term.
Vendor channel strategy is appropriate and the
channels themselves are strong.
Product Evaluation
The solutions dashboard and reporting tools
are intuitive and easy to use.
The delivery method of the solution aligns with
what is expected within the space.
The five-year TCO of the solution is economical.
The solution provides basic
and advanced feature/functionality.
The Info-Tech Data Integration Tool Vendor Shortlist tool is designed to
generate a customized shortlist of vendors based on your key priorities.
Identify leading candidates with the Data Integration Tool
Vendor Shortlist tool
This tool offers the ability to modify:
Overall vendor vs. product weightings (top-level weighting of product vs. vendor)
Individual product criteria weightings
Individual vendor criteria weightings
Custom Vendor Landscape and Vendor Shortlist
Your customized Vendor Shortlist is sorted based on the priorities identified on the Data Entry tab. Scores are calculated using
the Client Weightings and the assigned Info-Tech Vendor Landscape scores. Vendors are ranked based on the computed
Average Score. The Average Score is the average of the weighted average Vendor Score and the weighted average Product
Score. A custom Vendor Landscape has been generated as well, plotting the weighted average Vendor Score against the
weighted average Product Score.
Custom Vendor Landscape for [Enterprise Name Here]
Issue an RFP to ensure that Data Integration Tools vendors fit
your needs, and not the other way around
An RFP implies stable requirements and an intent to buy; use this tool to help select a supplier, not to develop a shortlist.
Use Info-Tech's Data Integration Solution RFP Template to conduct this critical step in your vendor selection process.
The Statement of Work
Proposal Preparation Instructions
Scope of Work
Functional Requirements
Technical Specifications
Operations & Support
Sizing & Implementation
Vendor Qualifications & References
Budget & Estimated Pricing
Vendor Certification
Info-Tech's DI Tools RFP Template is populated with critical elements, including:
To get the most value out of the RFP process, use the
Data Integration Tools RFP Scoring Tool
A standard and transparent process for scoring individual vendor RFP
responses will help ensure that internal team biases are minimized.
The Info-Tech Data Integration Solution
Evaluation & RFP Response Tool comes
pre-built with important scoring criteria for
vendor RFP responses.
This tool includes modifiable criteria across
the following categories:
Features (real-time integration)
Operational Requirements
(debugging, exception reporting)
Architecture (hosted deployment,
connector volume)
Adjust the individual category weightings to
customize this tool to business priorities.
Use Info-Tech's DI Tools RFP Scoring Tool to score vendor RFP responses.
Take charge of vendor finalist demonstrations with a
Vendor Demo Script
An on-site product demonstration will help enterprise decision-makers better
understand the capabilities and constraints of various solutions.
The Info-Tech Data Integration Vendor
Demonstration Script is designed to provide
vendors with a consistent set of instructions
for key scenarios.
This template includes examples for the
following scenarios:
Planning & Deployment
Meeting Setup & Operation
Software Installation
Initial Configuration
What's in this Section: Develop Your DI Tool Implementation Strategy
Include people, process, and technology in your
implementation preparation.
Align tool deployment with existing architecture.
Determine which tool attributes will drive your tool solution.
Develop your timeline in preparation for integration.
Introduce training early to avoid resistance.
Prepare for implementation to ensure a smooth process
1. Manage the Technology
Flexibility is King: Where possible, reduce future costs by selecting a tool that is reusable and can replace other single-function tools, minimizing licensing and support. Consider present and future needs.
When possible, automate: The closer your tool aligns with your desired functionality, the less hand-coding you will have to do, so capturing precise technical requirements is critical. Properly chosen, data integration tools are flexible and can lend themselves to complex integration scenarios.
Prune data first: Storing duplicate and unneeded data is expensive; the cost to integrate or migrate it is worse. Remove this data before integration to drive down costs and the timeframe.

2. Don't Forget People & Processes
Integration size, complexity, and available internal staff determine consulting need. For straightforward implementations, most enterprises should be able to proceed with their own staff or resources provided by their selected vendor or channel partners. Complex, large-scale integrations, or enterprises with very limited IT staff, may benefit from introducing consultants as soon as possible.

Info-Tech Insight: The goal of any data integration tool should be to make source data systems plug 'n' play. Keep this in mind when developing your process, as processes that don't enable this will require ongoing management, limiting cost and time savings and causing additional overhead.
Look to your existing data architecture to direct your tool
implementation architecture
DI tools fit into multiple data architectures. Align your tool deployment with
your existing data architecture to derive maximum benefit.
Point-to-Point: Tightly coupled connections between software applications.
Hub and Spoke: A centralized architecture that uses a data store (hub) to populate systems (spokes).
Federated: The data store does not actually store any data, only rules to direct and coordinate data between systems.
Distributed: Uses agents positioned between the system and the data store to reduce system processing load.
Tip: Start with lighter solutions, such as DI-as-a-Service options, to streamline the process.
Most DI tools support all of these structures. If you are looking to upgrade to one of them, tools will enable and simplify the process. The decision to pick a hub, rules engine, or intermediaries will depend on where in your system you would like to place the processing load.
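As a minimal sketch of the hub-and-spoke idea, a hub holds routing rules and pushes published changes to subscribed spoke systems. The event and system names below are hypothetical:

```python
# The hub stores routing rules only; spoke systems receive the data
# they subscribe to. Event and system names are illustrative.
SUBSCRIPTIONS = {"customer_update": ["billing", "support"]}

def publish(event_type, payload, delivered):
    """Route one change event from a source spoke through the hub.

    Appends (system, payload) pairs to `delivered` and returns how many
    spokes received the event.
    """
    targets = SUBSCRIPTIONS.get(event_type, [])
    for system in targets:
        delivered.append((system, payload))   # hub pushes to each spoke
    return len(targets)
```

Adding a new system means adding one subscription entry at the hub, rather than a new point-to-point link per existing application, which is the main appeal of the pattern.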
Not all DI projects are created equal, but they share common
attributes that may drive different solutions
Real-time or batch?
Though real-time and online integration is supplanting batch-oriented integration, real time is not always appropriate and can significantly increase costs and bog down performance. Unless up-to-date, real-time data will drive business value, batch integration remains entirely appropriate.
How do I address
huge data flow?
More advanced tools may provide parallel processing and workload balancing. Both
are useful for enterprises with large data volumes and overcome performance
issues stemming from bottlenecks in peak frequency times.
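As a rough illustration of workload balancing, the sketch below spreads a transformation over a pool of workers. A real DI engine would distribute across processes or cluster nodes rather than threads, and the name-normalizing transform is a placeholder:

```python
from concurrent.futures import ThreadPoolExecutor

def transform(record):
    """Stand-in transformation: trim and title-case a name field."""
    return {**record, "name": record["name"].strip().title()}

def run_balanced(records, workers=4):
    """Spread the transformation load across a worker pool.

    Threads keep the sketch simple; an engine handling large volumes
    would balance across processes or nodes to avoid peak-time bottlenecks.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(transform, records))
```

`pool.map` preserves input order, so downstream loads see records in the same sequence as a serial run.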
What about Data Quality?
Don't overlook data quality; it is key to data integration success, as integrating bad
data is ineffective. Many offerings include this functionality directly or through a third
party extension; in most cases the difference is negligible. Keep in mind the ratio of cost-
to-effectiveness, as 100% accuracy is rarely worth the investment.
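A data quality stage boils down to rules like the ones sketched here: normalize, reject incomplete or inaccurate values, and drop duplicates. The email-keyed example and its validation regex are illustrative assumptions; real tools apply far richer profiling and matching.

```python
import re

# Deliberately simple validity check; production tools use much
# stronger validation and fuzzy matching.
EMAIL_RE = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+$")

def cleanse(records, key="email"):
    """Normalize records on `key`, rejecting incomplete, invalid,
    and duplicate rows -- the core of a DI tool's quality stage."""
    seen, clean = set(), []
    for r in records:
        value = (r.get(key) or "").strip().lower()
        if not value or not EMAIL_RE.match(value):
            continue          # incomplete or inaccurate: reject
        if value in seen:
            continue          # duplicate: reject
        seen.add(value)
        clean.append({**r, key: value})
    return clean
```

Note the cost-effectiveness caveat above applies here too: each extra rule buys less accuracy, so stop before chasing the last fraction of a percent.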
What's Impact Analysis?
Impact analysis (i.e. tracing the impact of source data changes on reports or analysis)
requires metadata integration and data lineage and allows for development of what-if
scenarios. This functionality is recommended for enterprises with complex
integration architectures. (e.g. a target field is derived from multiple source fields).
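Impact analysis is essentially a reachability query over lineage metadata. A sketch with hypothetical field names, where a derived field may draw on several sources (as in the multi-source target field mentioned above):

```python
from collections import deque

# Hypothetical lineage metadata: each derived field lists the source
# fields it is computed from.
LINEAGE = {
    "dw.revenue": ["crm.amount", "erp.fx_rate"],
    "dw.region": ["crm.region_code"],
    "report.q1_sales": ["dw.revenue", "dw.region"],
}

def impacted(source_field):
    """Return every downstream field affected by a change to source_field."""
    # Invert the lineage map: source field -> fields derived from it.
    downstream = {}
    for derived, sources in LINEAGE.items():
        for s in sources:
            downstream.setdefault(s, []).append(derived)
    hit, queue = set(), deque([source_field])
    while queue:                       # breadth-first walk downstream
        for nxt in downstream.get(queue.popleft(), []):
            if nxt not in hit:
                hit.add(nxt)
                queue.append(nxt)
    return sorted(hit)
```

A what-if question such as "what breaks if `crm.amount` changes?" becomes a single call returning the affected warehouse fields and reports.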
When do I debug?
Debugging environments, often included, are useful for error-checking when
developing integrations, especially if staffing less experienced developers.
Your timeline length depends on your integration, but the
steps are the same
Vendor setup takes between one day and one month depending on your environment. Ensure you are internally prepared for integration, and start integrating data with a five-step process that runs from implementation through quality control to ongoing operation.

Implement:
If you are purchasing new hardware systems in conjunction with your tool, begin by mapping and connecting the new physical infrastructure before worrying about the tools themselves. Tools that will run on existing software typically feature straightforward installations for the agent and demonstration environment.

Prioritize integrations:
Start by automating crucial integrations that are deemed mission-critical to the business. If you are using legacy systems, start there, as these integrations are in most urgent need of replacing. Updating other, lower-priority integration points should be undertaken as a subsequent phase.

Typical development times:
Hand-coded integration: 8 weeks; with tools: 8 days (an 80% reduction).
Hand-coded integration: 24 weeks; with tools: 9 weeks (a 62% reduction).
Plan accordingly: more complex integrations will see a lower reduction in development time.
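The quoted reductions can be checked with quick arithmetic; the weeks-versus-days comparison assumes five-day work weeks, and the 62% figure is 62.5% rounded down:

```python
def reduction_pct(before, after):
    """Percentage reduction in development time (both in the same units)."""
    return round(100 * (1 - after / before), 1)

# 8 weeks hand-coded vs. 8 days with tools, at 5 working days per week
eight_weeks_in_days = 8 * 5
print(reduction_pct(eight_weeks_in_days, 8))   # → 80.0
# 24 weeks hand-coded vs. 9 weeks with tools
print(reduction_pct(24, 9))                    # → 62.5
```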
Avoid the pitfalls of using DI as a Service and integrating SaaS
SaaS vendors offer professional services to assist with data transfer, which
may be helpful for more complex integration projects.
Know When to Create a Custom Adapter
Some enterprises may elect to write their own connector
to facilitate bulk uploads. There are two triggers for this decision:
Real-time integration with other
systems. Some enterprises may require near
real-time integration with other applications such
as ERP, marketing automation, or customer
service applications.
Excessive scale and complexity. Third-party applications for bulk uploads will balk at transfers characterized by:
Over 200,000 records.
Over 1,000 data fields.
Most large SaaS vendors run off a single database
instance, which, while easier to maintain, presents
customer challenges:
It takes time. Moving batches of millions of
records takes a considerable amount of time.
Tweak the interface to improve performance.
Persistent connections are a must. Negotiating
thousands or millions of individual connections for
a batch transfer will cripple performance. Use the
persistent connection feature.
Multiple threads don't help. Starting multiple
sessions to transmit batched data will not help
performance due to authentication complications
and may even slow down performance.
Lock out unnecessary features. During batch
updates some usability features will constantly
update, leading to poor performance.
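The batching and persistent-connection advice can be sketched as follows. The `Connection` class is a stand-in for a real client (e.g. one reused HTTPS session authenticated once for the whole job), and the default batch size of 200 is arbitrary:

```python
def batches(records, size):
    """Split a large transfer into batches instead of per-record calls."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

class Connection:
    """Stand-in for a persistent API session: authenticate once and
    reuse the connection for every batch, rather than negotiating a
    new connection (and authentication) per record or per batch."""
    def __init__(self):
        self.handshakes = 1       # a single handshake for the transfer
        self.batches_sent = 0
        self.records_sent = 0

    def send(self, batch):
        self.batches_sent += 1
        self.records_sent += len(batch)

def bulk_upload(records, batch_size=200):
    conn = Connection()           # one session for the whole job
    for batch in batches(records, batch_size):
        conn.send(batch)          # no per-record reconnect or re-auth
    return conn
```

The key point matches the guidance above: a million-record transfer costs one handshake and a few thousand batch calls on a single connection, not a million negotiated connections.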
Overcoming negative SaaS perceptions is still an issue. When surveyed, 61% of Info-Tech respondents
believed SaaS to be more difficult to integrate than on-premise applications.
Technology is rarely the issue that causes integration projects
to fail; effective scoping & training will promote success
Before diving in, scope out the project.
Introduce training early to avoid user resistance.
Integration projects usually involve multiple stakeholders: there could be one or more for each database that is being integrated.
An owner of a database and/or application may be
hesitant to allow updates from an external process.
Don't boil the ocean: integrating multiple systems is best achieved in small chunks.
stakeholders that need to be involved. Too many
cooks in the kitchen can spoil the dinner.
Make sure stakeholders are engaged early in the
project and get their buy-in:
Leverage management resources as necessary to get
stakeholders on side.
Getting stakeholders involved in testing early and
often contributes to the success of the project:
Use an iterative approach to test early and test often
during implementation.
DI technologies continue to evolve. As many IT staffers
have relied on hand-coding, DI technologies may not
be universally understood or accepted.
Business requirements drive data needs. Review the
internal infrastructure. Realize that not every system will
need to be integrated and integration will be dependent
upon the scope of the business and data requirements.
Develop the internal culture for project acceptance;
include non-IT stakeholders to cover all areas of data use.
Understand that there may be major shifts to be made
in terms of traditional data management practices
before DI technologies can be successfully implemented
within an organization.
Determine if there is a major gap in skills prior to
execution, as data integration requires staff with very
specific skill sets. Prepare to hire or retrain staff if that
gap exists.
Expect developer resistance and overcome it through dialog
The market is moving away from custom coding but
some stubborn developers continue to hold their
organizations back.
Tools are strongly resisted by developers who feel the
tools take away control and freedom, yet the tools are
essential for data management maturity.
Developers will say the tool isn't robust enough, which
comes from insecurity and fear of being replaced.
Programming is a part of the job they like doing, so they
don't want to reduce it. They think tools mean they just
do data entry.
Yet system mapping and knowledge are more valuable.
They need to expand their thinking of what it means to
be a developer.
Managers don't train developers on data integration
tools. Address resistance through developer dialog and training:
Don't expect resistance to be demographically uniform,
or even logical; e.g. newer developers could expect
tools to be in place, or fiercely resist them, feeling the
need to prove themselves through coding.
Regardless of developer reasoning, the key to
convincing developers is process: show them how
tools will make their lives easier and they will be on board.
Developers want to develop. Tools are friendly.
"I think getting a third party, a second voice that
they're not used to hearing all the time, saying the
same thing in different words,
I think that made a significant impact."
- IT Director, Insurance
Research has shown that developer resistance is fading. Developers are finding that the tooling improves
their development processes, estimates, and outcomes. It is also freeing them up for more innovative work.
Tools now provide broader functionality beyond traditional ETL and measurably improve efficiency and operational performance.
DI as a Service (DIaaS) offerings are a viable option for organizations that are beginning to look at data integration and
have lighter integration needs.
Big Data, and how to manage it, has become a key IT strategy; however, organizations first need to determine if they
actually have a Big Data problem, or simply a classic data problem.
The overall value of BI is being recognized, but many solutions are still constrained by data quality issues. IT departments
need to utilize DI tools to ensure continual data quality as BI increases its role in decision making.
If a tool re-directs at least one developer from maintenance to innovation, it has probably paid for itself.
Organizations with poor data quality have difficulty getting budget approval. Make the case with the CIO and CFO that
data is an organization's best investment, and that implementing DI tools will provide significant ROI.
Developing a data integration tool selection strategy is a function of your information architecture maturity, data
architecture, features needed, and developer staff size.
Tool effectiveness is highly vendor-dependent, as many tools don't actually reduce necessary hand-coding.
Prioritize integration to include people, process, and technology, and align tool deployment with existing architecture.
Develop your timeline in preparation for integration, and automate as much as possible.
Get user and stakeholder buy-in by scoping the project thoroughly and providing early training.
At some point data integration tools become necessary for business growth.
They improve overall operations, marketing, and customer experience.
What's in this Section:
Understand Data Integration Trends and Considerations
Develop a DI Tool Selection Strategy
Evaluate DI Vendors
Develop Your DI Implementation Strategy
Vendor Landscape methodology
DI tools survey demographics
Business Intelligence (BI): Processes and applications that provide a holistic view of business operations and are used to support decision making.
Business Process Management (BPM): The continuous improvement of business processes to align them with customer needs.
Data Agility: The ability of a piece or group of data to be easily manipulated and adapted as needed.
Data Architecture: The design of data, systems, and tools that support an organization or business.
Enterprise Application Integration (EAI): Linking applications together through data integration.
Enterprise Information Integration (EII): Creating a single interface to view data from more than one source.
Extract, Transform and Load (ETL): The process of extracting data from an outside source, transforming it to meet needs, and loading it into the end target.
Legacy System: Old hardware or software that is still used and contains valuable data.
Master Data Management (MDM): Creating a single and inclusive view of data.
Software-as-a-Service (SaaS): Enables a company to rent an application from a vendor that hosts it.
Service-Oriented Architecture (SOA): The process of building scalable distributed systems that treat all software components as services.
Data Modeling: The process of defining and analyzing enterprise data to better manage data and improve business processes.
Vendor Evaluation Methodology
Info-Tech Research Group's Vendor Landscape market evaluations are a part of a larger program of vendor evaluations, which includes
Solution Sets that provide both Vendor Landscapes and broader Selection Advice.
From the domain experience of our analysts, as well as through consultation with our clients, a vendor/product shortlist is established.
Product briefings are requested from each of these vendors, asking for information on the company, products, technology, customers,
partners, sales models, and pricing.
Our analysts then score each vendor and product across a variety of categories, on a scale of 0-10 points. The raw scores for each
vendor are then normalized to the other vendors' scores to provide a sufficient degree of separation for a meaningful comparison. These
scores are then weighted according to weighting factors that our analysts believe represent the weight that an average client should
apply to each criterion. The weighted scores are then averaged for each of two high-level categories: vendor score and product score. A
plot of these two resulting scores is generated to place vendors in one of four categories: Champion, Innovator, Market Pillar, and
Emerging Player.
For a more granular category-by-category comparison, analysts convert the individual scores (absolute, non-normalized) for each
vendor/product in each evaluated category to a scale of zero to four whereby exceptional performance receives a score of four and poor
performance receives a score of zero. These scores are represented with Harvey Balls, ranging from an open circle for a score of zero
to a filled-in circle for a score of four. Harvey Ball scores are indicative of absolute performance by category but are not an exact
correlation to overall performance.
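The normalization, weighting, and Harvey Ball conversion described above reduce to simple arithmetic. A small sketch, with invented scores and a single category for illustration (these are not Info-Tech's actual weights or figures):

```python
# Sketch of the scoring pipeline: normalize raw 0-10 scores across vendors,
# then map absolute raw scores onto the 0-4 Harvey Ball scale.

raw_scores = {"VendorA": 8.0, "VendorB": 5.0, "VendorC": 2.0}  # one category

def normalize(scores):
    """Rescale scores to a 0-10 spread across vendors for meaningful separation."""
    lo, hi = min(scores.values()), max(scores.values())
    rng = (hi - lo) or 1  # guard against identical scores
    return {v: 10 * (s - lo) / rng for v, s in scores.items()}

def harvey_ball(raw, max_raw=10):
    """Map an absolute (non-normalized) raw score onto the 0-4 Harvey Ball scale."""
    return round(4 * raw / max_raw)

normalized = normalize(raw_scores)
print(normalized["VendorA"])  # 10.0 (best in the category)
print(harvey_ball(8.0))       # 3 (a mostly filled Harvey Ball)
```

Note that, as the methodology states, the Harvey Ball conversion uses the absolute scores, while the plotted vendor/product scores come from the normalized, weighted ones.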
Individual scorecards are then sent to the vendors for factual review, and to ensure no information is under embargo. We will make
corrections where factual errors exist (e.g. pricing, features, technical specifications). We will consider suggestions concerning benefits,
functional quality, value, etc.; however, these suggestions must be validated by feedback from our customers. We do not accept
changes that are not corroborated by actual client experience, or wording changes that are purely part of a vendor's market messaging or
positioning. Any resulting changes to final scores are then made as needed before publishing the results to Info-Tech clients.
Vendor Landscapes are refreshed every 12 to 24 months, depending upon the dynamics of each individual market.
Value Index Ranking Methodology
Info-Tech Research Group's Value Index is part of a larger program of vendor evaluations, which includes Solution Sets that
provide both Vendor Landscapes and broader Selection Advice.
The Value Index is an indexed ranking of value per dollar as determined by the raw scores given to each vendor by analysts.
To perform the calculation, Affordability is removed from the Product score and the entire Product category is reweighted to
represent the same proportions. The Product and Vendor scores are then summed and multiplied by the Affordability raw
score to produce the Value Score. Vendors are then indexed to the highest-performing vendor by dividing each vendor's score
by that of the highest scorer, resulting in an indexed ranking with a top score of 100 assigned to the leading vendor.
The Value Index calculation is then repeated on the raw score of each category against Affordability, creating a series of
indexes for Features, Usability, Viability, Strategy, and Support, with each being indexed against the highest score in that
category. The results for each vendor are displayed in tandem with the average score in each category to provide an idea of
over- and under-performance.
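The Value Index calculation can be sketched in a few lines; the vendor names and scores below are placeholders for illustration, not actual Info-Tech ratings:

```python
# Value Index sketch: (Vendor score + affordability-free Product score),
# multiplied by the raw Affordability score, then indexed to the top vendor.

def value_score(vendor_score, product_score_ex_affordability, affordability_raw):
    """Combine vendor and reweighted product scores, then weight by Affordability."""
    return (vendor_score + product_score_ex_affordability) * affordability_raw

scores = {
    "VendorA": value_score(7.0, 8.0, 9.0),  # 135.0
    "VendorB": value_score(6.0, 7.0, 5.0),  # 65.0
}
top = max(scores.values())
value_index = {v: round(100 * s / top) for v, s in scores.items()}
print(value_index)  # {'VendorA': 100, 'VendorB': 48}
```

The per-category indexes (Features, Usability, etc.) follow the same pattern, substituting each category's raw score for the combined score.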
The Value Index, where applicable, is refreshed every 12 to 24 months, depending upon the dynamics of each individual market.
Product Pricing Scenario & Methodology
Info-Tech Research Group provided each vendor with a common pricing scenario to enable normalized scoring of Affordability,
calculation of Value Index rankings, and identification of the appropriate solution pricing tier as displayed on each vendor scorecard.
Vendors were asked to provide list costs for Data Integration Tools and/or licensing to address the needs of a reference organization
described in the pricing scenario.
Additional consulting, deployment, and training services were within the scope of the pricing request, as was the cost of enhanced
support options, though vendors were encouraged to highlight any such items included with the base product acquisition. The annual
maintenance rate was also requested, allowing a three-year total acquisition cost to be calculated for each vendors data integration
solution. This three-year total acquisition cost is the basis of the solution pricing tier indicated for each vendor.
Finally, the vendors' three-year total acquisition costs were normalized to produce the Affordability raw scores and calculate Value Index
ratings for each solution.
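The three-year total acquisition cost underlying the pricing tiers is simple arithmetic: initial licensing plus services plus three years of maintenance. The figures below are hypothetical placeholders, not actual vendor quotes:

```python
# Three-year total acquisition cost = license + consulting/deployment/training
# services + 3 years of annual maintenance (quoted as a fraction of license).

def three_year_cost(license_cost, services_cost, maintenance_rate, years=3):
    """Total acquisition cost over the evaluation period."""
    maintenance = license_cost * maintenance_rate * years
    return license_cost + services_cost + maintenance

# Hypothetical: $250k license, $50k services, 20% annual maintenance.
total = three_year_cost(250_000, 50_000, 0.20)
print(total)  # 450000.0
```

This total is what gets normalized across vendors to produce the Affordability raw score used in the Value Index above.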
Key elements of the common pricing scenario provided to Data Integration vendors included:
Includes development and production licensing and annual maintenance for 3 years, in addition to any items associated
with the integration scenario, e.g. ancillary licensing and/or training and professional services.
Company information:
10,000 employees, 15 locations, 120 IT staff including 20 internal developers dedicated to development/integration.
Infrastructure information:
6 dual-processor mission-critical application servers located in a central data center, and 2 quad-processor database servers:
one runs SQL, the other runs Oracle.
Separate backup site that includes redundancy for all mission critical solutions.
Teradata Data Warehouse Appliance licensed for 3,000 users within the company.
Other applications in use include: an ERP system, an HR system, a Finance system, Exchange, and 5 additional internally
developed software solutions (including the company website, which is hosted internally).
All 10,000 employees have access to certain components of each of the systems.
Data Integration Tools Survey Demographics
[Survey demographic charts: respondents by number of full-time employees and by number of IT employees.]