
APRA's quest for data quality. RPG 702.

APRA, the ABS and the RBA have introduced RPG 702 Data Quality for the EFS Collection to assist reporting entities in meeting the data management and quality control expectations of the EFS reporting standards. Under the new EFS collection, reporting entities must meet data quality requirements covering the systems, processes and controls that assure the reliability of submitted data. EFS reports are being introduced in three phases in 2019 (read more about EFS here).

APRA expects reporting entities to have best practices in place for managing data quality, as per the principles defined in the Prudential Practice Guide CPG 235 Managing Data Risk. RPG 702 specifically focuses on the ABS/RBA data quality requirements of the EFS Collection. It sets out quantitative measures of data accuracy, with the goal of helping reporting entities understand the quality control requirements and the magnitude of the reporting errors that may affect the agencies' use of the data.

While the agencies expect a strong data quality framework across all data submitted in the EFS collection, RPG 702 introduces a priority categorisation, with benchmarks that indicate where reporting entities should focus their quality management and control practices. The guide introduces three priority categories: “standard”, “high” and “very high”, indicating the relative importance the agencies place on accuracy. Higher priority is given to total and subtotal items, with the aim of identifying errors relevant to the internal consistency of the series and errors likely to affect industry aggregates. The following graph shows the number of items prioritised within the high and very high categories.

The guide provides flexibility for greater use of judgement by the reporting entity in identifying items in breach of the prescribed benchmarks. Any item found above a benchmark will be considered a “reporting error”, and the agencies expect the reporting entity to notify APRA. Depending on the agencies' assessment, reporting errors may trigger further action by the reporting entity. These actions may include APRA requesting the data to be resubmitted, as well as the expectation that the entity review its data quality processes and controls. In addition, the agencies may seek further explanation of the root cause as well as a mitigation plan for the future.
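To make the benchmark concept concrete, below is a minimal sketch of how a reporting entity might flag potential reporting errors before submission. The priority categories come from RPG 702, but the threshold values and item names are hypothetical placeholders; the guide itself defines the actual benchmarks and priority assignments.

```python
# Illustrative sketch only: the priority categories come from RPG 702, but the
# benchmark thresholds and item names below are hypothetical placeholders.
BENCHMARKS = {          # maximum tolerated relative error per priority category
    "standard": 0.05,
    "high": 0.02,
    "very high": 0.01,
}

def flag_reporting_errors(items):
    """Return items whose error exceeds the benchmark for their priority.

    Each item is a dict with 'name', 'priority', 'reported' and 'correct'
    values; the relative difference is compared against the benchmark.
    """
    errors = []
    for item in items:
        benchmark = BENCHMARKS[item["priority"]]
        reported, correct = item["reported"], item["correct"]
        if correct == 0:
            continue  # avoid division by zero; treat zero-valued items separately
        relative_error = abs(reported - correct) / abs(correct)
        if relative_error > benchmark:
            errors.append({**item, "relative_error": relative_error})
    return errors

# Example with made-up figures (AUD millions)
sample = [
    {"name": "Total deposits", "priority": "very high", "reported": 1020.0, "correct": 1000.0},
    {"name": "Household loans", "priority": "high", "reported": 505.0, "correct": 500.0},
]
for err in flag_reporting_errors(sample):
    print(f"{err['name']}: error of {err['relative_error']:.1%} exceeds the benchmark")
```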

Not all the practices outlined in the RPG will be relevant for every reporting entity; relevance depends on the size, complexity and system configuration of the entity. The guide gives the reporting entity the flexibility to manage its EFS collection reporting in the manner best suited to its business. However, the agencies expect reporting entities to place a high importance on the quality of the data submitted in the EFS Collection, signalling stronger scrutiny of data management and risk strategy.

It is becoming increasingly important for reporting entities to have proper control and monitoring of data issues, along with well-developed processes to manage every stage of an issue's lifecycle: detection, identification, containment, investigation, resolution and the adjustment of controls to reduce the risk of similar issues in future.
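One way to keep those lifecycle stages visible is to record them explicitly against each issue. The sketch below is illustrative only: the stage names follow the paragraph above, while the record fields are assumptions rather than anything prescribed by the guide.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class IssueStage(Enum):
    """Lifecycle stages of a data issue, mirroring the stages noted above."""
    DETECTION = "detection"
    IDENTIFICATION = "identification"
    CONTAINMENT = "containment"
    INVESTIGATION = "investigation"
    RESOLUTION = "resolution"
    CONTROL_ADJUSTMENT = "control adjustment"

@dataclass
class DataIssue:
    """Illustrative record for tracking a data quality issue end to end."""
    summary: str
    affected_items: list
    raised_on: date
    stage: IssueStage = IssueStage.DETECTION
    root_cause: str = ""

    def advance(self, stage: IssueStage, note: str = "") -> None:
        """Move the issue to a later lifecycle stage, optionally recording a cause."""
        self.stage = stage
        if note:
            self.root_cause = note

issue = DataIssue("Duplicate loan records inflating totals", ["Household loans"], date.today())
issue.advance(IssueStage.INVESTIGATION, "Upstream feed loaded twice")
print(f"{issue.summary}: {issue.stage.value} ({issue.root_cause})")
```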

Furthermore, the guide's data quality requirements place emphasis on having proper systems and processes that enable entities to perform data lineage, so they can identify and explain the underlying factors driving the quality of the data.
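As a rough illustration of the kind of lineage capability the guide points towards, the sketch below maps a reported item back to the upstream source fields and transformations that produce it, so an analyst can explain what drives a reported figure. The item, source-system and transformation names are hypothetical.

```python
# Hypothetical lineage map: each reported EFS item is traced back to the
# source-system fields and transformations that produce it.
LINEAGE = {
    "Total deposits": {
        "sources": ["core_banking.deposit_balances", "treasury.wholesale_deposits"],
        "transformations": ["currency conversion to AUD", "aggregation by product"],
    },
}

def explain(item: str) -> str:
    """Return a plain-English trace of where a reported item comes from."""
    entry = LINEAGE.get(item)
    if entry is None:
        return f"No lineage recorded for {item}"
    return (f"{item} is derived from {', '.join(entry['sources'])} "
            f"via {', '.join(entry['transformations'])}.")

print(explain("Total deposits"))
```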

Ultimately, the EFS collection is forcing reporting entities to rethink their data management approaches and practices. It is important that reporting entities employ best practices in managing data risk, in particular:

  • Principles-based approach to data – data risk is managed as part of a systematic and formalised approach.

  • Risk appetite and controls – data risk is considered, and appropriate controls are implemented, at each stage of the data lifecycle: capture, processing, retention, reporting and disposal.

  • Data validation – a key control for ensuring quality requirements are met and for assessing fitness for use (a minimal sketch of an internal-consistency check follows this list).

  • Monitoring and managing data issues – the data quality benchmarks should be considered when monitoring and managing data issues relating to the EFS collection.

  • Assurance – providing regular assurance that data quality is appropriate and data risk management is effective.
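As flagged in the data validation point above, below is a minimal sketch of an internal-consistency check: it verifies that a reported total equals the sum of its components before submission. The tolerance and item names are illustrative assumptions, not rules taken from the reporting standards.

```python
# Illustrative internal-consistency validation: a total should equal the sum
# of its components within a small tolerance. Item names are hypothetical.
TOLERANCE = 0.5  # AUD millions, allowing for rounding in component series

def validate_total(report: dict, total_item: str, component_items: list) -> list:
    """Return a list of validation failures (empty if the check passes)."""
    failures = []
    total = report[total_item]
    component_sum = sum(report[item] for item in component_items)
    if abs(total - component_sum) > TOLERANCE:
        failures.append(
            f"{total_item} ({total}) does not equal the sum of its components ({component_sum})"
        )
    return failures

report = {"Total loans": 1500.0, "Household loans": 900.0, "Business loans": 595.0}
print(validate_total(report, "Total loans", ["Household loans", "Business loans"]))
# Flags a difference of 5.0, which exceeds the 0.5 tolerance
```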

Addressing APRA’s expectations of data quality should not be a tick-the-box exercise. By implementing best practice data governance principles and processes, ADIs and RFCs will increase the value of one of their greatest assets: their data. Firms can unlock strategic value by leveraging their clean, reconciled data sets for internal analytics and management reporting, combining regulatory compliance with operational efficiencies and better-informed decision making.

About the author

Robert is a senior Information Management professional with over 10 years of experience at the Australian Prudential Regulation Authority (APRA). Within APRA's Data Analytics department he led a team of senior Business Analysts responsible for business process automation, data collections and reporting solutions, data analytics, business intelligence and resource management.

Robert worked on the implementation of APRA regulations across various regulated industries, including ADIs, superannuation and insurance. He has performed project management, business analysis and change management roles and led large technical teams on projects.

After 10 years’ service at the regulator, he is well equipped to support Australian regulated institutions by delivering cutting-edge solutions that drive efficiencies, deliver insight and ensure regulatory compliance.
