(2019-20)
Subject: Introduction to Business Analytics
(Scheme & Solution)
Instructions:
• All questions are compulsory in Parts A, B, and C.
• Attempt any three questions in Part D.
4 Expand RFID
a. Radio Frequent Identification b. Radio Frequency Identicator
c. Radio Frequency Identification d. None
11 Expand CRISP-DM
a. Cross Industry Standard Process for Data Mining b. Cross In-Standard Process for Data Mining
c. Cross Industry Standard Procedure for Data Mining d. None
24 Expand ECM
a. Expertise Content Management b. Enterprise Content Management
c. Enterprise Commission Management d. None
25 Select all which are not examples of Business Intelligence and performance monitoring.
a. Reporting b. Balanced scorecards
c. Both (a) & (b) d. None
• ETL tools: The main objective of an ETL tool is to extract data from one or several source
systems, transform data of many different formats into a common format, and then load that data
into the data warehouse.
• Query and reporting tools: The query and reporting tools help users run regular reports,
create lists for the organization, and execute queries whose results are presented in tables. These
tools are meant to allow users to interact directly with the organization’s data.
• Data warehouse: A collection of appropriate business data that has been validated and
organized so that it can be analyzed to support business decision-making. The data used to
populate the data warehouse are extracted from distributed, often unrelated databases, in some
cases external to the organization that is using it.
• OLAP techniques: These techniques are used to study complex data in real time in a
database that is constantly updated with transaction data. OLAP helps managers study data
from different perspectives and explore it to discover hidden information.
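The ETL flow described above can be sketched in Python. This is a minimal illustration, not a real tool: the source records, field names, and target format are all invented for the example.

```python
# Minimal ETL sketch: extract records from two hypothetical source systems
# that use different formats, transform them into a common format, and
# load them into a list standing in for the data warehouse.

# Extract: raw records as they arrive from each source system
source_a = [{"cust": "Acme", "amount_usd": "1200.50"}]
source_b = [{"customer_name": "Beta Ltd", "amount_cents": 99900}]

def transform(record):
    """Convert a source record into the common warehouse format."""
    if "cust" in record:  # format used by source A
        return {"customer": record["cust"],
                "amount": float(record["amount_usd"])}
    return {"customer": record["customer_name"],  # format used by source B
            "amount": record["amount_cents"] / 100}

# Load: append the transformed rows to the warehouse table
warehouse = [transform(r) for r in source_a + source_b]
print(warehouse)
```

Both records end up with the same `customer`/`amount` schema, which is the "common format" the ETL definition refers to.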
o Innovation Metrics: The ability to bring newer offerings to market quickly and in a more advanced fashion
• Customer-Interface Metrics
o Customer’s perception of the firm’s performance on each level of the 7 C’s of the
customer interface
▪ Example: customer’s rating of customization compared to the competitors
o Other critical interface metrics
▪ Including value proposition
o Customer-interface and customer-outcome metrics
• Business-model Metrics
o Value Proposition or Cluster Benefits Metrics –
▪ Attention to consumer perceptions
▪ Performance related to the competitor's benefits
▪ target segments, benefits offered, capabilities that drive benefits
o Capture subcomponents of the business model: financial metrics and resource
systems (e.g., the egg diagram of the resource system)
• Opportunity Metrics
o Assess whether the market-opportunity metrics accurately measure the market opportunity
• Financial Metrics
o Revenues
o Costs
o Profits
o Balance-sheet metrics
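The financial metrics listed above reduce to simple arithmetic; here is a toy illustration with made-up figures:

```python
# Toy financial metrics with hypothetical figures.
revenue = 500_000.0  # total revenues
costs = 420_000.0    # total costs

profit = revenue - costs
profit_margin = profit / revenue  # profit as a fraction of revenue

print(profit, round(profit_margin, 2))  # 80000.0 0.16
```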
5. What are the benefits of big data?
• Anything involving customers could benefit from big data analytics. This
includes better-targeted social-influencer marketing, customer-base segmentation,
and recognition of sales and market opportunities. Recent economic changes
worldwide have changed consumer behaviors. Big data analytics can help develop
definitions of churn and other customer behaviors, as well as an understanding of
consumer behavior from clickstreams.
• Business intelligence in general can benefit from big data analytics. This could
result in more numerous and accurate business insights, an understanding of
business change, better planning and forecasting, and the identification of root
causes of cost.
• Specific analytic applications are likely beneficiaries of big data analytics. For
example, consider analytic applications for the detection of fraud, the quantification of
risks, or market sentiment trending. At the leading edge, big data analytics might help
automate decisions for real-time business processes such as loan approvals or fraud
detection.
Part D - Eight-mark questions
1. Explain BAO reference architecture functional pillars and foundational layers with diagram.
The BAO reference architecture uses a combination of a functional pillar and a foundational
layer approach to promote simplicity, to enhance understanding, and to ensure
comprehensive coverage: Pillars (across the top of Figure 4) define the main functional
components that typically make up a comprehensive BAO solution. The pillars categorize
components that can support a single solution set, such as Enterprise Content Management
or Master Data Management (MDM). Because of the modularity and flexibility of the
architecture, it can provide a common solution framework for BAO across all sectors,
industries, and solution areas.
Layers (under the pillars in Figure 4) provide the robust and cohesive foundation where the
BAO pillars can function in an integrated and seamless fashion. Let us look at the BAO
reference architecture a little closer by examining each pillar and the foundational layer:
Sources identify all of the diverse sources of data, available within and outside an
organization, that are accessed and used as part of a comprehensive BAO environment. These
data sources include data from enterprise applications, structured and unstructured data,
master or reference data, data from external sources (such as the web and mobile devices), and
device information (from sensors).
Data management solutions manage, store, and deliver trusted data throughout the data
supply chain. Information management supports the following key capabilities:
Content management, which consists of services, technologies, and processes that are used
to capture, manage, store, preserve, and deliver unstructured content. It provides global
access, workflow, and management of digital assets that are used for integrating and sharing
data between a company and its employees, customers, suppliers, and business partners.
Master data management, which is a combination of disciplines, technologies, and solutions
used to create and maintain consistent, complete, contextual, and accurate business data for all
stakeholders across and beyond the enterprise.
Data integration, which provides a uniform way to manage enterprise data flows including
batch, real-time, and transactional message processing. The data integration components
focus on the processes and environments that deal with the capture, qualification,
processing, and movement of data. Data can flow directly to the consuming systems, or it
can be prepared for storage in a data repository. The data integration components can
process data in scheduled batch intervals or in real-time (streaming), near real-time, or just-
in-time intervals. Data integration also supports distributed transaction staging, depending on
the nature of the data and the business purpose for its use.
Data repositories, which supply a necessary foundation for the storage of functional and
reshaped data that improve business value. The data repositories are not a replacement for,
or a replica of, operational databases. Rather, they are a complementary set of heterogeneous
data repositories that reshape data into formats that are necessary for making decisions and
managing a business.
The use of Big Data is becoming crucial for leading companies seeking to outperform their peers. In
most industries, established competitors and new entrants alike will leverage data-driven
strategies to innovate, compete, and capture value. In healthcare, data pioneers are
analyzing the health outcomes of pharmaceuticals when they are widely prescribed, and
discovering benefits and risks that were not evident during clinical trials. Other early adopters
of Big Data are using data from sensors embedded in products from children’s toys to
industrial goods to determine how these products are actually used in the real world. Such
knowledge then informs the creation of new service offerings and the design of future
products. In addition to the sheer scale of Big Data, the real-time and high-frequency nature
of the data are also important. The ability to estimate metrics such as consumer confidence,
something which previously could only be done retrospectively, is becoming more
extensively used, adding considerable power to prediction. Similarly, the high frequency of
data allows users to test theories in near real-time and to a level never before possible.
Need for Big Data
There are five broad ways in which using big data can create value.
• Big Data can unlock significant value by making information transparent. There is still a
significant amount of information that is not yet captured in digital form, e.g., data that are
on paper, or not made easily accessible and searchable through networks.
• As organizations create and store more transactional data in digital form, they can collect
more accurate and detailed performance information on everything from product
inventories to sick days and therefore expose variability and boost performance. In fact,
some leading companies are using their ability to collect and analyze big data to conduct
controlled experiments to make better management decisions.
• Big Data allows ever-narrower segmentation of customers and therefore much more
precisely tailored products or services.
• Sophisticated analytics can substantially improve decision-making, minimize risks, and
unearth valuable insights that would otherwise remain hidden.
• Big Data can be used to develop the next generation of products and services. For
instance, manufacturers are using data obtained from sensors embedded in products to
create innovative after-sales service offerings such as proactive maintenance to avoid
failures in new products.
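The "ever-narrower segmentation of customers" point can be illustrated with a tiny rule-based sketch. The segment names, attributes, and cut-off values here are invented purely for illustration; real segmentation would typically use clustering over far richer data.

```python
# Rule-based customer segmentation sketch with invented thresholds.
customers = [
    {"id": 1, "annual_spend": 12000, "visits": 40},
    {"id": 2, "annual_spend": 300,   "visits": 2},
    {"id": 3, "annual_spend": 4500,  "visits": 15},
]

def segment(c):
    """Assign a customer to a segment using hypothetical cut-offs."""
    if c["annual_spend"] > 10000 and c["visits"] > 20:
        return "high-value"
    if c["annual_spend"] > 1000:
        return "regular"
    return "occasional"

segments = {c["id"]: segment(c) for c in customers}
print(segments)  # {1: 'high-value', 2: 'occasional', 3: 'regular'}
```

Adding more attributes (and finer thresholds) narrows the segments, which is what enables the more precisely tailored products and services mentioned above.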
• Building a data mining model is part of a larger process which includes everything
from asking questions about the data and creating a model to answer those questions,
to deploying the model into a working environment.
• This process is defined by the six basic steps
– Defining the Problem.
– Preparing Data.
– Exploring Data.
– Building Models.
– Exploring and Validating Models.
– Deploying and Updating Models.
Data understanding: Verify the data against its documentation, and identify data-quality
problems.
• Gathering data.
• Verifying quality.
• Exploring.
• Describing.
Data preparation:
• Constructing data.
• Selecting the data.
• Integrating the data.
• Formatting data for modelling.
• Cleaning data.
Modelling: Mathematical techniques are used to identify patterns within the data. This
includes:
• Selecting the techniques.
• Designing test.
• Building the models
• Assessing the models.
Deployment: Put your discoveries to work in daily business:
• Deployment planning.
• Final result reporting.
• Review final results.
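The phases above can be sketched as one minimal end-to-end pipeline. The raw data, the cleaning rules, and the "model" (a simple above-the-mean flag) are all stand-ins chosen only to show the flow from understanding to deployment.

```python
# Minimal sketch of the data understanding -> preparation -> modelling ->
# deployment flow. Data, cleaning rules, and the "model" are stand-ins.

raw = [5, 7, None, 6, 120, 8]  # gathered data, with quality problems

# Data understanding: describe the data and note quality issues
missing = sum(1 for v in raw if v is None)

# Data preparation: clean the data (drop missing values and an outlier)
clean = [v for v in raw if v is not None and v < 100]

# Modelling: a trivial "model" -- flag values above the mean
mean = sum(clean) / len(clean)
def model(v):
    return v > mean

# Deployment: apply the model to new observations in daily use
flags = [model(v) for v in [4, 9]]
print(missing, mean, flags)  # 1 6.5 [False, True]
```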