1) The document discusses the need for universal financial data standards like Legal Entity Identifiers (LEIs) to effectively monitor systemic risk.
2) The current LEI system faces challenges around data quality, lack of standardization between Local Operating Units, and an inability to accommodate real-time aggregation needs.
3) A next-generation LEI system is proposed using blockchain technology run by Local Operating Units to provide global updates on a faster timescale while maintaining a complete historical record. This would help address current challenges around speed, scalability, and data quality.
2. The need for Quality Hierarchical LEI data: DTI and other emerging Universal Financial data standards (UTI, UPI, et al.) are enabled through advanced contextual graph models plus exploitation of Blockchain's disruptive technological capability
Ontology2, 4/7/2016
Bill Freeman, Paul Houle, Allan Grody (patent pending)
3. Daily, GLEIF concatenates and archives the data
[Diagram: Financial Stability Board, CFTC, OFR, Treasury; Regulatory Oversight Committee, Basel, EC, global regulators; these bodies established and oversee the Global Legal Entity Identifier Foundation; Local Operating Units register entities and supply data files to GLEIF]
Global LEI System (GLEIS): A Work in Process; Current State Under Stress
Individual records are supposed to be updated at least yearly, but the lapse rate for such updates is 25%
(LEI records must be updated sooner for corporate actions to make the GLEIS relevant for use as a golden copy for internal use by financial institutions)
Ultimately needed: real-time, streaming data analytics to monitor SIFIs, trading, capital adequacy, and the buildup of systemic risk
LOUs publish daily updates, which are consolidated daily by the GLEIF. Process and data variables at LOUs are problematic (i.e., ___________, ____________, __________) [Note: add examples]
A complete snapshot of the data is kept every day, giving a valuable record of system history
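As a rough illustration of the consolidation step described above, here is a minimal sketch (Python, with a purely hypothetical directory layout and file naming) that concatenates the daily file from each LOU into a dated archive snapshot:

    from datetime import date
    from pathlib import Path

    # Hypothetical layout, assumed only for illustration:
    #   lou_files/<LOU>/<YYYY-MM-DD>.csv        daily file published by one LOU
    #   archive/<YYYY-MM-DD>/concatenated.csv   consolidated daily snapshot
    LOU_DIR = Path("lou_files")
    ARCHIVE_DIR = Path("archive")

    def consolidate_daily(snapshot_date: date) -> Path:
        """Concatenate every LOU's daily file into one archived snapshot."""
        day = snapshot_date.isoformat()
        out_dir = ARCHIVE_DIR / day
        out_dir.mkdir(parents=True, exist_ok=True)
        out_file = out_dir / "concatenated.csv"
        with out_file.open("w", encoding="utf-8") as out:
            for lou_file in sorted(LOU_DIR.glob(f"*/{day}.csv")):
                out.write(lou_file.read_text(encoding="utf-8"))
        return out_file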
4. Current Challenges
Uncertain use and value of LEIs in recording transactions in Trade Repositories (TRs): tens of billions of transactions have been sent to TRs with no way of validating the LEIs' usefulness and quality
Different internal data and communication models at LOUs will prevent advanced technology deployment if not standardized
LEIs as a mechanism to aggregate data on a hierarchical-relationship basis are being deferred to an uncertain outcome
The use of the GLEIS for aggregating financial transaction data for systemic risk
analysis, the key objective for regulators, is not yet accommodated in either the rule
making or the design of the GLEIS
The promise to financial industry members of straight-through processing, significant infrastructure cost reduction, and risk mitigation is not being realized, nor is it being identified in near-term or future industry plans
Few standard operating procedures; processes remain manual
Mutual funds ignored (pretty much), if not entirely
Challenging to understand the quality of information, its sources, and audit
5. Reference Data As Seen By Financial IT
[Diagram: trading software, entity data, product data, trade data, analytics, reporting, risk management]
Straight-through processing (risk elimination): needs immediate reaction to changes affecting entities
Analytics and reporting (risk management): need to understand the current state AND also what reference data was relevant to the trade at the time
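A minimal sketch of that "as of the trade" requirement, assuming a simple versioned record layout (field names here are illustrative, not the GLEIS schema):

    from dataclasses import dataclass
    from datetime import datetime
    from typing import List, Optional

    @dataclass
    class EntityRecord:
        lei: str               # Legal Entity Identifier
        legal_name: str        # name in force while this version was valid
        valid_from: datetime   # when this version of the record took effect

    def as_of(history: List[EntityRecord], lei: str, when: datetime) -> Optional[EntityRecord]:
        """Return the version of an entity's record that was in force at `when`."""
        versions = [r for r in history if r.lei == lei and r.valid_from <= when]
        return max(versions, key=lambda r: r.valid_from, default=None)

    def current(history: List[EntityRecord], lei: str) -> Optional[EntityRecord]:
        """Return the latest version, i.e. the current state of the entity."""
        return as_of(history, lei, datetime.max)

Straight-through processing would query current(), while analytics replaying a trade would query as_of() with the trade timestamp.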
6. Current LEI System is Not Fast, Scalable, or Secure
Update Rates Suffer
5 days of daily updates requires 5 copies of the data
5 days of hourly updates requires 120 copies of the data (a single LOU operating around the clock produces 24 snapshots a day, so 5 days × 24 = 120)
and hourly updates might not be fast enough to keep up with real-time trading and changing conditions under stress
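The arithmetic behind those figures, spelled out for a single LOU (the retention window and cadences are the ones used above):

    def snapshot_count(days_retained: int, updates_per_day: int) -> int:
        """Full data copies implied by keeping every update over the retention window."""
        return days_retained * updates_per_day

    print(snapshot_count(5, 1))     # daily updates,  5 days ->    5 copies
    print(snapshot_count(5, 24))    # hourly updates, 5 days ->  120 copies
    print(snapshot_count(5, 1440))  # per-minute,     5 days -> 7200 copies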
7. Next-Generation LEI System
[Diagram: the Global Legal Entity Identifier Foundation manages a blockchain run collectively by the Local Operating Units]
Blocks in chain represent changes to reference data published
by Local Operating Units in conjunction with their registrants
and their customers
We get the best of both worlds:
Global reference data updated on a few-second timescale
AND
A complete historical record at the time of trade (keep the block id for exact reference!)
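A minimal sketch of what a block of reference-data changes, and the block id a trade could keep, might look like; the fields and SHA-256 hashing shown are assumptions for illustration, not the proposed design:

    import hashlib
    import json
    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import List

    @dataclass
    class ReferenceDataChange:
        lei: str              # entity whose reference data changed
        attribute: str        # e.g. "legal_name" or "headquarters_address"
        new_value: str
        publishing_lou: str   # LOU that published the change

    @dataclass
    class Block:
        previous_hash: str
        changes: List[ReferenceDataChange]
        timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

        @property
        def block_id(self) -> str:
            """Hash of the block contents; storing it on a trade pins the exact reference data."""
            payload = json.dumps(
                {"prev": self.previous_hash,
                 "ts": self.timestamp,
                 "changes": [vars(c) for c in self.changes]},
                sort_keys=True)
            return hashlib.sha256(payload.encode("utf-8")).hexdigest()

A trade record that stores the block_id at execution time can always be replayed against exactly the reference data that was in force.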
8. Ontology2 Technology Enhances Blockchain
Lambda Architecture operation (Superbatch) to process updates in either a large batch (the whole database) or in small batches (one block). Perfect for blockchain!
Ability to accept or reject changes from the blockchain to minimize the impact of incorrect or malicious data on operations, reporting, or analysis
Semantic capability to represent, reason about, and work with multiple representations of a situation (integrate blockchain-sourced reference data with legacy and future systems)
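A minimal sketch of the accept-or-reject idea in Lambda terms: the same (placeholder) validation is applied whether one block is folded in incrementally or the whole database is rebuilt from the chain:

    from typing import Dict, List

    def is_acceptable(change: Dict) -> bool:
        """Placeholder checks; real rules would be far richer."""
        lei = change.get("lei", "")
        if len(lei) != 20 or not lei.isalnum():   # LEIs are 20 alphanumeric characters (ISO 17442)
            return False
        return bool(change.get("attribute")) and change.get("new_value") is not None

    def apply_block(database: Dict, block_changes: List[Dict]) -> Dict:
        """Speed-layer path: fold one block of changes into the current view."""
        for change in block_changes:
            if is_acceptable(change):
                database.setdefault(change["lei"], {})[change["attribute"]] = change["new_value"]
            # rejected changes are dropped here; in practice they would be logged for review
        return database

    def rebuild(all_blocks: List[List[Dict]]) -> Dict:
        """Batch-layer path: replay the full chain to reconstruct the whole database."""
        database: Dict = {}
        for block_changes in all_blocks:
            apply_block(database, block_changes)
        return database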
9. Index Cloud
New Index Construction Never Conflicts With Production
[Diagram: over time, the old index (multiple copies for throughput and availability) keeps serving production while source data feeds a cloned, testable build of the new index; resources are managed and recovered once the new index takes over]
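A minimal sketch of the build-then-swap pattern the diagram implies (names here are illustrative): production keeps serving from the old index, the new index is built and tested as a clone, and the swap happens only after the tests pass:

    def release_resources(index) -> None:
        """Return whatever the index holds to the pool (no-op placeholder)."""
        del index

    class IndexCloud:
        def __init__(self, production_index):
            self.production = production_index   # old index copies keep serving queries

        def query(self, q):
            return self.production.search(q)

        def rebuild(self, source_data, build_index, passes_tests):
            """Build a new index from source data without ever touching production."""
            candidate = build_index(source_data)   # construction happens off to the side
            if passes_tests(candidate):            # test the clone before it sees traffic
                old, self.production = self.production, candidate   # swap in the new index
                release_resources(old)             # manage and recover resources
            else:
                release_resources(candidate)       # a failed build never reaches production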
10. Intelligent Index (ML) plus Real-Time Index
Build can be self-correcting
Effortless and efficient scalability
[Diagram: a message queue and bulk data feed time-stamped master data into a small real-time index and a large bulk index, which are merged to produce results]
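A minimal sketch of the merge step, assuming results keyed by record id carry a timestamp so the fresher entry from the small real-time index overrides the large bulk index:

    from datetime import datetime
    from typing import Dict, Tuple

    def merge_results(bulk_hits: Dict[str, Tuple[datetime, dict]],
                      realtime_hits: Dict[str, Tuple[datetime, dict]]) -> Dict[str, Tuple[datetime, dict]]:
        """Newer timestamp wins, so recent updates shadow stale bulk entries."""
        merged = dict(bulk_hits)
        for key, (ts, record) in realtime_hits.items():
            if key not in merged or ts > merged[key][0]:
                merged[key] = (ts, record)
        return merged

    # Illustrative only: the bulk index has an older legal name, the real-time index a newer one.
    bulk = {"entity-1": (datetime(2016, 4, 1), {"legal_name": "Acme Holdings"})}
    live = {"entity-1": (datetime(2016, 4, 7), {"legal_name": "Acme Holdings plc"})}
    print(merge_results(bulk, live)["entity-1"][1]["legal_name"])   # -> Acme Holdings plc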
11. New approach to data management
A FRAMEWORK FOR DATA QUALITY
[Diagram: multiple sources of instance data (facts, classifications, reference data), examples (test data, training data), and requirements feed quality metrics]
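A minimal sketch of turning requirements into quality metrics over instance data; the two checks below are placeholders, and test/training data would calibrate thresholds before metrics are applied in production:

    from typing import Dict, List

    def completeness(records: List[Dict], required_fields: List[str]) -> float:
        """Fraction of records that populate every required field."""
        if not records:
            return 0.0
        ok = sum(1 for r in records if all(r.get(f) for f in required_fields))
        return ok / len(records)

    def lei_validity(records: List[Dict]) -> float:
        """Fraction of records whose LEI is structurally plausible (20 alphanumeric chars)."""
        if not records:
            return 0.0
        ok = sum(1 for r in records if len(r.get("lei", "")) == 20 and r["lei"].isalnum())
        return ok / len(records)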