
Big Data

Big data is a collection of data sets so large and complex that they become difficult to process using on-hand database management tools or traditional data processing applications. The challenges include capture, storage, search, sharing, transfer, analysis and visualization. The trend toward larger data sets is due to the additional information derivable from analysis of a single large set of related data, as compared to separate smaller sets with the same total amount of data, allowing correlations to be found to "spot business trends, determine quality of research, prevent diseases, link legal citations, combat crime, and determine real-time roadway traffic conditions." As of 2012, limits on the size of data sets that were feasible to process in a reasonable amount of time were on the order of exabytes of data.

Scientists regularly encounter limitations due to large data sets in many areas, including meteorology, genomics, complex physics simulations, and biological and environmental research. The limitations also affect internet search, finance and business informatics. Data sets grow in size in part because they are increasingly gathered by ubiquitous information-sensing mobile devices, aerial sensory technologies, software logs, cameras, microphones, radio-frequency identification readers, and wireless sensor networks. The world's technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s; as of 2012, 2.5 quintillion (2.5 × 10^18) bytes of data were created every day.

The challenge for large enterprises is determining who should own big data initiatives that straddle the entire organization. Big data is difficult to work with using most relational database management systems and desktop statistics and visualization packages, requiring instead "massively parallel software running on tens, hundreds, or even thousands of servers".
What is considered "big data" varies depending on the capabilities of the organization managing the set, and on the capabilities of the applications that are traditionally used to process and analyze the data set in its domain. "For some organizations, facing hundreds of gigabytes of data for the first time may trigger a need to reconsider data management options. For others, it may take tens or hundreds of terabytes before data size becomes a significant consideration."
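The "massively parallel software" quoted above typically follows a map/reduce pattern: the data is split into partitions, each partition is processed independently (possibly on a different server), and the partial results are merged. A minimal single-machine sketch in Python, with invented sample data for illustration (real systems such as Hadoop distribute these steps across many machines):

```python
from collections import Counter
from functools import reduce

def map_partition(lines):
    """Map step: count words within a single data partition."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def reduce_counts(a, b):
    """Reduce step: merge two partial word counts."""
    return a + b

# Simulate three partitions of a larger log, as if held on three servers.
partitions = [
    ["debit card payment", "atm withdrawal"],
    ["atm withdrawal", "wire transfer"],
    ["debit card payment", "atm deposit"],
]

partials = [map_partition(p) for p in partitions]  # independent; parallel in practice
total = reduce(reduce_counts, partials, Counter())
print(total["atm"])  # → 3
```

The key property is that no map step depends on another, which is what lets frameworks scale the same logic from one machine to thousands.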

Advanced Analytics

The banking industry is data-intensive, with typically massive graveyards of unused and unappreciated ATM and credit processing data. As banks face increasing pressure to stay profitable, understanding customer needs and preferences becomes a critical success factor. New models of proactive risk management are being increasingly adopted by major banks and financial institutions, especially in the wake of the Basel II accord. Through data mining and advanced analytics techniques, banks are better equipped to manage market uncertainty, minimize fraud, and control exposure risk.

According to IBM's 2010 Global Chief Executive Officer Study, 89 percent of banking and financial markets CEOs say their top priority is to better understand, predict and give customers what they want. Financial metrics and KPIs provide effective measures for summarizing overall bank performance.

But in order to discover the set of critical success factors that will help banks reach their strategic goals, they need to move beyond standard business reporting and sales forecasting. By applying data mining and predictive analytics to extract actionable insights and quantifiable predictions, banks can understand all types of customer behavior, including channel transactions, account opening and closing, default, fraud and customer attrition.

Insights about these banking behaviors can be uncovered through multivariate descriptive analytics, as well as through predictive analytics such as credit scoring. Banking analytics, or applications of data mining in banking, can help improve how banks segment, target, acquire and retain customers. Additionally, improvements to risk management, customer understanding and fraud detection enable banks to maintain and grow a more profitable customer base. The importance of these measures is underscored by the Basel II accord, which explicitly emphasizes the need to embrace intelligent credit management methodologies in order to manage market uncertainty and minimize exposure risk.
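To make "credit scoring" concrete, here is a toy points-based scorecard of the kind predictive analytics produces. The attribute names, point rules and base value are invented for illustration and do not reflect any real bank's or bureau's model:

```python
# Hypothetical scorecard: each attribute contributes points toward a score.
# In practice the rules would be fitted statistically (e.g. from a logistic
# regression over historical default data), not hand-written like this.
SCORECARD = {
    "on_time_payment_ratio": lambda v: round(v * 100),       # 0.0-1.0 -> up to 100 pts
    "utilization":           lambda v: round((1 - v) * 50),  # lower utilization is better
    "years_with_bank":       lambda v: min(v, 10) * 5,       # tenure, capped at 50 pts
}

def credit_score(applicant: dict) -> int:
    """Sum the points each attribute contributes; higher means lower risk."""
    base = 300  # arbitrary floor, echoing common commercial score ranges
    return base + sum(rule(applicant[attr]) for attr, rule in SCORECARD.items())

applicant = {"on_time_payment_ratio": 0.95, "utilization": 0.30, "years_with_bank": 4}
print(credit_score(applicant))  # → 450
```

The value of such a model for a bank lies less in any single score than in applying it consistently across millions of accounts to segment customers and price risk.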

While analytics aren't exactly new to the world of banking, plenty of banks are gearing up for their next big analytics push, propelled by a flood of data and new, sophisticated tools and technologies. Why has business analytics jumped to the top of the priority list for banks? Pick a reason: regulatory reform, managing risk, changing business models, expansion into new markets, a renewed focus on customer profitability. Any one of these is reason enough for many banks to reconsider what today's analytics capabilities can offer.

A host of significant recent changes in the banking industry has resulted in a long list of business challenges that the practice of business analytics may be positioned to address. A number of financial institutions have been quick to recognize and adopt this emerging technology, and it is changing the banking landscape, giving banks and financial institutions previously untapped savings, margins and profit. For example, Bank of America Merrill Lynch is using Hadoop technology to manage petabytes of data for advanced analytics and new regulatory requirements.

According to Deloitte research, three business drivers increase the importance of analytics within the banking industry:

Regulatory reform: Major legislation such as Dodd-Frank, the CARD Act, FATCA (the Foreign Account Tax Compliance Act) and Basel III has changed the business environment for banks. Given the focus on systemic risk, regulators are pushing banks to demonstrate a better understanding of the data they possess, turn data into information that supports business decisions, and manage risk more effectively. Each requirement has major ramifications for data collection, governance and reporting. Over the next several years, regulators will finalize details in the recently passed legislation; however, banks should start transforming their business models today to comply with a radically different regulatory environment.

Customer profitability: Personalized offerings are expected to play a big role in attracting and retaining the most profitable customers, but studies show that only a small percentage of banks have strong capabilities in this area. The CARD Act and Durbin Amendment make it even more important to understand the behavioral economics of each customer and find ways to gain wallet share in the most profitable segments.

Operational efficiency: While banks have trimmed a lot of fat over the past few years, there is still plenty of room for improvement, including reducing duplicative systems, manual reconciliation tasks and information technology costs.

Facts on big data and advanced analytics

By 2020, one third of all data will be stored in or have passed through the cloud, and we will have created roughly 35 zettabytes (35 × 10^21 bytes) of data.
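The scale of projections like this can be sanity-checked with simple arithmetic. The sketch below assumes the widely cited ~35-zettabyte-by-2020 projection and IBM's 2012 figure of 2.5 quintillion bytes created per day:

```python
# Unit definitions
ZETTABYTE = 10**21
QUINTILLION = 10**18

projected_2020 = 35 * ZETTABYTE    # total bytes projected by 2020 (assumed figure)
daily_2012 = 2.5 * QUINTILLION     # bytes created per day as of 2012 (assumed figure)

# At the 2012 daily rate alone, years needed to accumulate the 2020 projection:
years = projected_2020 / (daily_2012 * 365)
print(round(years, 1))  # → 38.4
```

That the 2012 rate would take nearly four decades to reach the eight-year projection illustrates how sharply data creation was expected to accelerate.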

1986: data volume 3 exabytes; data variety 99% analog; computing capacity 0.001 bn MIPS
2007: data volume 295 exabytes; data variety 94% digital; computing capacity 6.380 bn MIPS

Sources:
http://en.wikipedia.org/wiki/Big_data
http://www.ibmbigdatahub.com/blog/analytics-banking-services
http://www.slideshare.net/McK_CMSOForum/big-data-and-advanced-analytics
http://www.mckinsey.com/insights/marketing_sales/putting_big_data_and_advanced_analytics_to_work
http://www.advancedanalytics.org/

There is a full presentation on big data and advanced analytics on SlideShare, which I have already shared; it will be very helpful.
