“Go beyond fairy tales because the devil is in the details” – regulators around the world have started to realise the importance of going deeper than the mere “sums & differences” reported in submissions from their regulated entities. Having learnt lessons from the global financial crisis (GFC) of the last decade, they know that collecting information through reports aggregated at the highest levels is not as prudent a practice as was widely believed before the crisis.
Regulators’ limited ability to see through the reported data vastly hindered their capacity to spot changing market dynamics, excessive risk-taking trends or unfair business practices undertaken by firms under their jurisdiction.
With aggregated reporting, it was also not uncommon for industry players to walk back their reported figures by changing and re-submitting them. Regulators (albeit annoyed) could neither effectively curb the re-submission of returns nor identify the root cause of the issues locked within the re-submitting entities’ underlying data.
It was not merely the convenience of collecting only a handful of files from complying entities that held regulators back from imposing transactional data submission requirements; the lack of safe and efficient technology for periodically collecting larger volumes of data also played its part.
However, with advances in technology and a post-GFC climate that demanded more prudence from regulators, recurring data-gathering practices at the transaction level were initiated in multiple jurisdictions.
In 2011, the European Central Bank (ECB) proposed to kick off transactional data collection on deposits and loans to non-individuals under a project named AnaCredit (analytical credit datasets). Under the requirement, all lenders operating in the Eurozone are obliged to submit about 100 data attributes at the instrument/deal level for loans to any entity whose aggregate exposure crosses a €25,000 threshold. The project reportedly came into practice with the first stream of data collected in September 2018 by national supervisors and transmitted onward to the ECB’s central data repository.
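The threshold works at the counterparty level rather than per loan: once a borrower’s aggregate exposure crosses €25,000, every individual instrument against that borrower becomes reportable. A minimal sketch of that logic is below; the field names, data shapes and the use of total commitment as the measure of exposure are illustrative assumptions, not the ECB’s actual reporting schema.

```python
# Illustrative sketch of the AnaCredit threshold rule: instrument-level
# reporting is triggered per counterparty, once that counterparty's
# aggregate exposure crosses EUR 25,000. Field names are assumptions.

ANACREDIT_THRESHOLD_EUR = 25_000

def counterparties_in_scope(loans):
    """Return the counterparty IDs whose aggregate exposure meets the threshold."""
    totals = {}
    for loan in loans:
        cid = loan["counterparty_id"]
        totals[cid] = totals.get(cid, 0) + loan["commitment_eur"]
    return {cid for cid, total in totals.items() if total >= ANACREDIT_THRESHOLD_EUR}

def reportable_instruments(loans):
    """Every individual instrument belonging to an in-scope counterparty."""
    in_scope = counterparties_in_scope(loans)
    return [loan for loan in loans if loan["counterparty_id"] in in_scope]

loans = [
    {"instrument_id": "L1", "counterparty_id": "C1", "commitment_eur": 15_000},
    {"instrument_id": "L2", "counterparty_id": "C1", "commitment_eur": 12_000},
    {"instrument_id": "L3", "counterparty_id": "C2", "commitment_eur": 9_000},
]

# C1's aggregate (27,000) crosses the threshold, so both of its loans are
# reportable at instrument level; C2's single 9,000 loan is not.
print([loan["instrument_id"] for loan in reportable_instruments(loans)])  # ['L1', 'L2']
```

The key point the sketch makes is that a small loan can still be reportable if the same borrower holds other loans with the lender; filtering loan by loan against the threshold would miss that.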
Another massive transaction-data-gathering exercise initiated in the aftermath of the GFC is the G20 OTC Derivatives Market Reforms. The largely unregulated OTC derivatives markets were widely regarded as one of the biggest causes of financial distress in economies around the globe. The G20 nations therefore collectively committed to mitigating systemic risk in the OTC markets by encouraging the growth of standardised and centrally cleared derivative deals and the collection of transaction-level reports through licensed or prescribed trade repositories.
One globally recognised phenomenon is that, although regulators do an effective job of supervising and controlling financial entities, they have failed on several fronts to effectively protect end consumers’ interests. Therefore, one common theme among the granular data collection programs being implemented across the globe is regulators’ intention to derive analytical insights from the gathered data – insights that would help them form policies to not only enhance financial markets’ efficiency & transparency but also protect the interests of consumers in the market.
THE RECURRENT DATA COLLECTION PILOT PROGRAM OF ASIC
The Australian Securities and Investments Commission (ASIC), Australia’s corporate, markets and financial services regulator, recently introduced a pilot program to collect mortgage and managed funds transaction-level data on a recurring basis.
This initiative to collect transaction-level data from mortgage lenders as well as fund managers aims to:
support ASIC’s decision-making with data-derived facts & evidence
stay ahead of the curve in understanding changing dynamics within the financial industry
spot emerging trends, risks and opportunities in regulating market participants
protect the interests of financial industry consumers
It is important to note that one of the motivations behind ASIC’s introduction of such a recurring data collection program was the recommendations of the Productivity Commission’s report released in August 2018.
The commission’s report pointed out various shortcomings in the nation’s financial markets, including diminished competition due to the unchallenged dominance enjoyed by the big four banks and major insurance firms. According to the commission, such persistently low levels of competition have often led to poor consumer outcomes, mainly through a lack of options to switch and the selling of financial products that do not serve buyers’ best interests.
Furthermore, to address the declining competition in financial markets, where the balance is currently tilted heavily in favour of very few players, the commission recommended that the Australian Competition and Consumer Commission (ACCC) act as both a watchdog and an intervening regulator to promote a level playing field for all financial institutions, regardless of their size or market share.
The Productivity Commission (PC), in its August 2018 report, pointed out the need for regulators to immediately look beyond regulatory reporting and drill down to the deal level to spot anomalies that may indicate compromises of consumers’ best interests. The report also recommended that ASIC, as the product regulator, collect mortgage data from ADIs at intervals of no more than six weeks, assess it for wrongdoing and publish the findings on its website.
Apart from collecting mortgage lending data, ASIC also took note of complaints from investors about the wide gaps between the returns fund managers committed to and those they actually delivered, and therefore decided to collect and analyse data from regulated fund management entities as well.
This recurring data collection pilot program commenced in June 2018, when ASIC sent separate letters to regulated entities inviting them to volunteer to participate in the program.
The letters also detailed the timelines of the pilot program, alongside information on the formation of support groups comprising ASIC and industry professionals who will help propel the program forward while overseeing its progress. These groups are intended to help ASIC benefit from industry expertise in data sharing and processing while minimising the cost impact on participants.
To achieve the mature state of data collection and utilization, ASIC has formed two dedicated support groups:
1. Strategic Reference Group
This group will act as ASIC’s voice in dealings with industry participants and will also ensure the resolution of issues while overseeing the program and maintaining its pace. It comprises industry, consumer and ASIC representatives.
2. Implementation Working Group
As the name suggests, this group will work on the ground to ensure a smooth, frictionless implementation of the pilot as well as of the underlying recurring data collection process. ASIC has onboarded professionals (both internal and external) with varied project, technology and data analytics experience.
Below is a snapshot briefly depicting the data collection endeavor of ASIC:
Alongside this pilot, ASIC plans to assess the feasibility and benefits of recurring data collection. To maximise the value of the process while avoiding scenarios involving ad-hoc data requests, ASIC has committed to carrying out exhaustive groundwork, in ongoing coordination with industry participants, to finalise data delivery protocols and data requirements that suit the participants while also fulfilling ASIC’s needs in terms of data granularity.
On timelines, the regulator kicked off the pilot by conducting workshops with volunteering participants in September 2018 and opened channels for ongoing dialogue through the strategic reference group. These coordination meetings with participants are scheduled to continue until Q2 2019, when the first tranche of data is expected to be collected.
The regulator’s interest in collecting data at such unprecedented levels of granularity is a welcome step for consumers in the financial industry. It should certainly help address the lack of transparency in the managed funds industry and the widening gaps between promised and delivered returns to investors.
Considering the Royal Commission’s findings, coupled with the uncertainty surrounding the real estate market, it is a welcome initiative for a regulator to look into transaction-level data to gather insights and thereby paint an informed, data-led picture of the industry – one that points out potential risks while also indicating where the market’s growth opportunities lie.
However, one cannot ignore the additional pressure such data requirements will put on the financial industry, especially when it is already grappling with enhanced demands from other regulators, such as APRA’s EFS reporting, which carries implicit data quality commitments for financial institutions (FIs), and the RBA’s enhanced requirements for securitisation data submission.
Needless to say, only those firms with an excellent data strategy and scalable infrastructure will be able to weather the storm and possibly emerge unhurt when the regulatory tide settles. And considering the number of reforms on the cards for regulators, including APRA’s capital reforms and the Royal Commission’s recommendations, challenging times for FIs won’t be ending anytime soon.