
Real Time Banking Applications

Ajay Uke
18th July 2018

© 2016 Cognizant

Background

• The banking sector is moving increasingly towards real-time, straight-through processing.

• This is a long-term shift that has been under way for many years, but it has recently gained momentum, driven largely by new business models and new technology platforms.

Real-Time Payments: Global View

• There is no universally accepted definition of Real-Time Payments.

• Implementation drivers range from the need for a robust payment infrastructure and the restrictive capabilities of current alternatives to regulatory mandates.

• Features differ from implementation to implementation (interbank account-to-account, universal access, ISO standards, settlement times, maximum value, fees, etc.).

• Banks offer different value-added services leveraging Real-Time Payments, e.g. agency banking and simplified cross-border payments.

Systems affected by Real-Time / Faster Payments

Payment Channels / User Interfaces
• Channels must be updated to include provisions to initiate Real-Time Payments transactions.
• Entirely new capabilities might be required on some channels, such as agency banking channels.

Core Payment Engine
• The engine should encapsulate new rules, e.g. those pertaining to transmission and settlement windows.
• The engine should be enhanced to post incoming and outgoing messages to accounts in real time.

Transaction/Payments Database
• Real-Time Payments is expected to enhance the enrichment of payments/remittance data.
• New reference data capabilities will be needed to capture and process the extended data set.

Payment Gateway & Settlement Systems
• The choice of clearing and settlement model by the Fed/NACHA will impact net settlement and the payment gateway.
• Changes are envisaged both on the existing inward and outward gateways and on the inward and outward APIs.

Charging, Billing & Statements
• The new product will introduce the need to update charging functionality to incorporate new charges.
• The new product will impose implementation, operational and opportunity costs on banks.
• It will mandate changes in the billing and statement methodologies adopted by banks.

General Ledger
• Most GL and accounting systems register transactions through batch processes.
• Changes in the GL and accounting systems will be needed to post transactions on a real-time basis.

Liquidity & Reconciliation
• Banks face major challenges in managing the liquidity and settlement limits monitored by the Fed.
• A system overhaul will be needed to monitor a bank's liquidity, particularly during non-banking hours.

Compliance, Anti-Fraud & AML Systems
• To comply with regulations, banks must perform anti-money laundering (AML) checks on a real-time basis.
• A complete review of channel security standards is needed, given the irrevocable nature of real-time payments.

Impact of RTP on Payment Value Chain Functions
Real Time Payments has a significant impact on Operations and IT across the Payments Value Chain

[Diagram: mapping of payment value chain functions to impacted systems]

Payment value chain functions: Payment Initiation & Tracking; Payment Validation; Payment Routing and Processing; Accounting Interface; Other Processes; Reporting. These cover activities such as payment entry, authentication, AML/sanctions screening, authorization, accounting rules, funds control, settlement, posting/balance updates, charge calculation, billing and invoice generation, reconciliations, liquidity management, transaction monitoring, data mining/warehousing, analytics and report generation, plus integrations with agency banking, trade finance, ERP, lending and payables/receivables management.

Impacted systems: Payment Channels / UI (online banking, mobile banking, branch, IVR); Core Payment Engine (real-time payment processing system); Transaction/Payments Database (payment repository); Payment Gateway & Settlement Systems (e.g. RTP Gateway); Charging, Billing & Statements; Accounting Systems (e.g. Hogan DDA); Liquidity & Reconciliation; Compliance, Anti-Fraud & AML systems (e.g. Actimize).

Legend: very high / high impact vs. low / medium impact.

“Information Half-Life” and the Diminishing Value of Data

https://www.slideshare.net/AmazonWebServicesLATAM/big-data-analytics-innovating-at-the-speed-of-light

Real-time Banking Conceptual Reference Architecture

• The primary differentiator of a so-called "real-time" architecture is that it is "event-driven" (from a strict computer science standpoint, this is something quite distinct from "real-time computing").

• There is no batch processing. All reports and computations are up-to-date all the time.

• The heart of the architecture is typically based on some kind of message-oriented middleware. This area has seen recent innovations, and current architectures can support very high throughput.

• Data sources are decoupled from business processes using the messaging paradigm.

• It is typical to apply recent database / persistence concepts, e.g. NoSQL databases and horizontally scalable filesystems. "Documents" in such stores often correspond closely to messages.

[Diagram: Data Source(s) → Ingestion Process / APIs → Message Oriented Middleware → Stream Processing (Business Processes A–D) → Business Logic → Persistence (Data Lake, Archiving) → Reporting, Data Access APIs, Analytics → Data Consumer(s)]
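
To make the messaging-based decoupling concrete, here is a minimal sketch of the pattern. It assumes a local Kafka broker, a hypothetical `payments` topic and the kafka-python client; the field names are purely illustrative and not part of any actual design described in these slides.

```python
# Minimal event-driven sketch: a data source publishes payment events to the
# message-oriented middleware (Kafka); a business process consumes them
# independently. Assumes `pip install kafka-python` and a broker on localhost:9092.
import json
from kafka import KafkaProducer, KafkaConsumer

# Data source side: emit each event as soon as it happens (no batch window).
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("payments", {"payment_id": "P-1001", "amount": 250.0, "currency": "EUR"})
producer.flush()

# Business process side: a decoupled consumer that knows nothing about the producer.
consumer = KafkaConsumer(
    "payments",
    bootstrap_servers="localhost:9092",
    group_id="fraud-screening",            # each business process gets its own group
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    event = message.value
    print(f"screening payment {event['payment_id']}: {event['amount']} {event['currency']}")
    break  # demo only: stop after the first event
```

Because consumers subscribe to topics rather than to each other, new business processes (Processes B, C, D in the diagram) can be added without touching the data sources.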

Example of an Actual Hybrid Real-time / Batch Processing Architecture Developed by Cognizant

Example of a Real-Time Payments Infrastructure High-Level Architecture
Published by Fintech Orwell

Kafka, Apache Spark

Apache Kafka is a distributed system commonly described as a scalable and durable message commit log. Kafka often acts as a reliable event ingestion layer that can durably store and aggregate events coming from multiple sources, and that can act as a single source from which different consumers receive multiple types of events.

Apache Spark is an open-source project for fast distributed computation and processing of large datasets. It operates primarily in memory and can use resource schedulers such as YARN, Mesos or Kubernetes. Spark can process distributed datasets from external storage, for example HDFS, Cassandra, HBase, etc.

https://lenadroid.github.io/posts/kafka-hdinsight-and-spark-databricks.html
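A common way to combine the two is sketched below, under assumptions (local broker, a hypothetical `payments` topic, an invented JSON schema, and the spark-sql-kafka connector on the classpath): Spark Structured Streaming reads the Kafka topic and keeps an aggregate continuously up to date.

```python
# Illustrative Kafka -> Spark Structured Streaming pipeline (a sketch, not a
# production design). Run with the Kafka connector, e.g.:
#   spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.5.0 app.py
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StringType, DoubleType

spark = SparkSession.builder.appName("realtime-payments-demo").getOrCreate()

# Assumed shape of the JSON events on the hypothetical "payments" topic.
schema = (StructType()
          .add("payment_id", StringType())
          .add("currency", StringType())
          .add("amount", DoubleType()))

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")
       .option("subscribe", "payments")
       .load())

payments = raw.select(from_json(col("value").cast("string"), schema).alias("p")).select("p.*")

# A continuously maintained aggregate: totals stay current without a batch window.
totals = payments.groupBy("currency").sum("amount")

query = (totals.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```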

Using Kafka and Apache Spark on Azure to Build Real-Time Applications

Some Benefits of Real-time Banking
• Up-to-date data has higher value. It is a general truth of information systems that recent data is more valuable than older data – some commentators refer to this as an “information half-life”. This depreciation in value also holds on (very) short timescales – it can be the difference between winning and losing deals, or between preventing a fraud or attack and discovering it too late. There is also a big impact on user experience, which has value in its own right.
• https://blogs.oracle.com/analyticscloud/the-half-life-of-data-and-the-role-of-analytics
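
As a purely illustrative way to picture the metaphor (not a formula from the article above), data value can be modelled as decaying exponentially with an assumed half-life:

```python
# Hypothetical "information half-life" illustration: value halves every
# `half_life_minutes`. All numbers are invented for demonstration only.
def data_value(initial_value: float, age_minutes: float, half_life_minutes: float) -> float:
    return initial_value * 0.5 ** (age_minutes / half_life_minutes)

# A fraud signal assumed to lose half its value every 5 minutes:
for age in (0, 5, 15, 60):
    print(f"{age:>3} min old -> value {data_value(100.0, age, 5.0):6.2f}")
# prints 100.00, 50.00, 12.50 and ~0.02
```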

• New business models and “open banking”. Real-time data may allow new services to be constructed. For example, it may allow integration with third parties via APIs, the data itself may become a product, or it may remove a barrier that previously prevented a certain type of financial instrument from being viable.
• https://go.forrester.com/blogs/your-data-is-worth-nothing-unless-you-use-it/

• Meeting employee expectations and productivity improvements. Employees may simply expect real-time processing
because that is what they experience in other sectors. It is also possible that having up-to-date information will improve
productivity due to reduced waiting times or prevention of rework.
• https://www.ibm.com/blogs/internet-of-things/the-last-best-experience/

• Real-time compliance. Compliance demands, both internal and from regulators, will increasingly require more up-to-date information and will gradually move towards a real-time model. This is an example of how external factors will drive the shift to real-time architectures.
• http://paymentsjournal.com/real-time-forensics-cognitive-automation-how-ai-is-set-to-transform-banking-payments/
• https://www.acamstoday.org/keeping-up-with-technology-compliance-in-real-time/
Ingestion, Streaming, Storage products

Technology and Solution Highlights

Ingestion
• Technology: Kafka – Apache Kafka for event aggregation and ingestion
• Solution highlight: Scalability (high data volume)

Streaming
• Technology: Spark – Apache Spark for stream processing
• Solution highlight: Fault tolerance, reliability and durability

Processing
• Technology: Python – Scala, Java or R can also be used for processing
• Solution highlight: Component-based plug and play (extensibility)

Reporting
• Technology: MongoDB, Node.js and Python scripting
• Solution highlight: Microservices-based architecture (TPA)
Appendix

Some References, Background and Tutorials…
…to help you to generate ideas and understand the concepts.
• Some Viewpoints on Event-Driven Architecture
• https://serverless.com/blog/matthew-lancaster-event-driven-architecture-transform-banking-emit-2017/
• https://martinfowler.com/articles/201701-event-driven.html
• https://content.pivotal.io/blog/how-to-deliver-an-event-driven-architecture
• https://docs.microsoft.com/en-us/azure/architecture/guide/architecture-styles/event-driven

• Kafka Stream Processing


• https://kafka.apache.org/11/documentation/streams/tutorial

• Azure Streaming Analytics and Power BI


• https://azure.microsoft.com/en-us/blog/streamanalytics-and-powerbi/

• Akka Stream Processing in Java


• https://doc.akka.io/docs/akka/current/stream/index.html?language=java

• Apache HBase
• https://hbase.apache.org/

Some Ideas for Dev Challenges

• Use Kafka stream processing and machine learning to identify transactions that may be anomalies.
• https://www.slideshare.net/KaiWaehner/apache-kafka-streams-machine-learning-deep-learning
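
One minimal way to start on this challenge (a sketch under assumptions: the kafka-python consumer, a hypothetical `payments` topic, and an invented two-feature model trained on synthetic history) is to score each event as it arrives with scikit-learn's IsolationForest. The same scoring step could equally sit inside a Kafka Streams or Spark job, as the linked slides discuss.

```python
# Illustrative streaming anomaly detection: score each payment event with a model
# trained offline. Assumes `pip install kafka-python scikit-learn numpy`;
# the feature set and training data are synthetic placeholders.
import json
import numpy as np
from kafka import KafkaConsumer
from sklearn.ensemble import IsolationForest

# "Historical" (amount, hour_of_day) pairs -- a synthetic stand-in for real data.
rng = np.random.default_rng(42)
history = np.column_stack([rng.lognormal(3.0, 1.0, 5000), rng.integers(0, 24, 5000)])
model = IsolationForest(contamination=0.01, random_state=42).fit(history)

consumer = KafkaConsumer(
    "payments",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for msg in consumer:
    event = msg.value
    features = np.array([[event["amount"], event["hour_of_day"]]])
    if model.predict(features)[0] == -1:   # -1 marks an outlier in scikit-learn
        print(f"possible anomaly: {event}")
```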

• Define and create an API for your application so that it can ingest data from external sources or so that its real-time data
can be easily consumed.
• https://dzone.com/articles/api-first-approach-and-api-management-with-swagger
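
A small sketch of the ingestion side of such an API (Flask, the route and topic name are assumptions for illustration): external sources POST events over HTTP, and the endpoint forwards them onto the Kafka topic the rest of the pipeline already consumes. A Swagger/OpenAPI description of the endpoint would then document the contract, in the spirit of the link above.

```python
# Illustrative API-first ingestion endpoint: accept events over HTTP and forward
# them to Kafka. Assumes `pip install flask kafka-python`; route and topic names
# are invented for the example.
import json
from flask import Flask, request, jsonify
from kafka import KafkaProducer

app = Flask(__name__)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

@app.route("/api/v1/payments", methods=["POST"])
def ingest_payment():
    event = request.get_json(force=True)
    if not event or "payment_id" not in event or "amount" not in event:
        return jsonify({"error": "payment_id and amount are required"}), 400
    producer.send("payments", event)
    return jsonify({"status": "accepted", "payment_id": event["payment_id"]}), 202

if __name__ == "__main__":
    app.run(port=8080)
```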

• Can you use Akka graphs to merge multiple input streams for your business process?
• https://doc.akka.io/docs/akka/2.5/stream/stream-graphs.html

• Consider re-engineering your existing data model to NoSQL. How will you provide replacements for triggers, integrity
checks, data model versioning, stored procedures? Will your tables map directly to documents, or can you use structured
documents in NoSQL more effectively?
• https://ac.els-cdn.com/S1877050915011758/1-s2.0-S1877050915011758-main.pdf?_tid=4f1d4828-85b4-4fcb-ba8c-a6f931aed622&acdnat=1527624680_dcdee32951e56051f86dd52b898bf870
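
As a starting point for that exercise (collection, field names and the schema-version convention below are invented for illustration), a payment together with its status history and charges could become one structured document instead of several joined tables, with application-level checks standing in for triggers and constraints.

```python
# Illustrative NoSQL remodelling: payment + status history + charges folded into a
# single document. Assumes `pip install pymongo`; all names are hypothetical.
from datetime import datetime, timezone
from pymongo import MongoClient

payments = MongoClient("mongodb://localhost:27017")["bank"]["payments"]

doc = {
    "_id": "P-1001",
    "schema_version": 2,                      # explicit versioning instead of DDL migrations
    "debtor": {"account": "NL01BANK0123456789", "name": "ACME BV"},
    "creditor": {"account": "DE02BANK0987654321", "name": "Widget GmbH"},
    "amount": {"value": 250.00, "currency": "EUR"},
    "charges": [{"type": "instant_payment_fee", "value": 0.20, "currency": "EUR"}],
    "status_history": [                       # a child table in the relational model
        {"status": "RECEIVED", "at": datetime.now(timezone.utc)},
    ],
}

# Application-level stand-in for a trigger / integrity constraint:
assert doc["amount"]["value"] > 0 and doc["amount"]["currency"] in {"EUR", "USD"}

payments.replace_one({"_id": doc["_id"]}, doc, upsert=True)
```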

If you’re stuck for ideas then you could follow an existing tutorial, then
try to modify it for your domain.
https://www.youtube.com/watch?v=Ap3MTTVPcHk&list=PLTgRMOcmRb3NetGD3XZcus9E8sC6MxKBE
https://www.youtube.com/watch?v=HxhoAkhor-w

Ingestion Process / APIs
• https://www.youtube.com/watch?v=Zqm7XJFhMc0
• https://www.youtube.com/watch?v=b_cu_-LYe3U

Message Oriented Middleware / Stream Processing (Business Processes)
• https://www.youtube.com/watch?v=7hHsqmbJ3oA

Business Logic
• https://www.youtube.com/watch?v=0YIBHfgasok

Persistence / Data Lake / Reporting / Data Access APIs
• https://www.youtube.com/watch?v=HxhoAkhor-w

Another approach could be to take an existing project from GitHub and try to
modify it to fit your domain.

• These are some example projects that are available on GitHub:

• API Management Layer


• https://github.com/nanovazquez/workshop-azure-api-management

• Analytics and Data Lake


• https://github.com/amynic/TechHer

• Advanced Analytics and AI


• https://github.com/dem108/MicrosoftCloudWorkshop-Asia

• Continuous Data Processing with Apache Spark and Azure Event Hubs
• https://github.com/Azure/azure-event-hubs-spark

Some background material regarding real-time banking.

• BusinessWire: “Nearly 80 Percent of Banks Expect Real-Time Payments and Open Banking Will Drive Payments Transformation Over Time”
• https://www.businesswire.com/news/home/20180430005162/en/80-Percent-Banks-Expect-Real-Time-Payments-Open

• Accenture: “Real-Time Payments for Real-Time Banking: How banks can seize the full opportunities of immediate payments”
• https://www.accenture.com/nl-en/insight-realtime-payments-for-realtime-banks

• Deloitte: “Banking Industry Outlook: Banking reimagined”
• https://www2.deloitte.com/pg/en/pages/financial-services/articles/banking-industry-outlook-future-banking.html

• IBM: “The Future of Banking: Mobile, Data-Driven, Real-Time”


• https://www.ibm.com/watson/infographic/discovery/future-banking/

GPC Issue

[Diagram: comparison of the existing overnight batch flow with the flow after new products are added, across Sources, Accounting, Data Warehousing and Reporting. Source systems (FPL Portal, BTS, GBM/GBM-CAL, Kondor, Diba, Neuhans) feed accounting systems (PAR, PAR-FAS, BEST, FAS, LRR), which feed the warehousing layer (ODS, FRD, DWH, PWH) and finally the reporting exports. Batch windows run roughly from 10 pm to 9 am, and with the new products several downstream windows shift later into the morning. Legend: meeting SLA vs. missing SLA.]


ING Example Case: Isolation between GS and other Business Units
[Diagram: two parallel instances of the real-time reference architecture, one for another business unit (e.g. DBNL) and one for GS. Each stack has its own ingestion process / API, message-oriented middleware, stream processing (business processes), business logic, persistence (data lake), reporting, archiving and analytics, with data consumers and data access APIs on the GS side. The stacks exchange data only through a Business Unit to GS Gateway, keeping the “Other Business Unit Scope” isolated from the “GS Scope”.]

