For at least thirty years, the mortgage industry has pursued a recurring theme: the battle for market share and short-term profits has led IT dollars to pour into systems for quickly processing, selling or securitizing, and servicing loans. Unfortunately, controls on data integrity have not garnered an appropriate share of IT budgets. The result is wonderful systems populated with less than wonderful data. The fallout manifests in misevaluated risks, volatile bond prices, misguided collection strategies, extensive litigation and damages, and heightened government involvement, scrutiny, and fines.
Data integrity problems reportedly cost U.S. businesses $600 billion annually, and recent significant losses can be traced directly to poor data quality. Prior to 2006, the industry relied primarily on enforcing lender representations and warranties, but recent widespread litigation has shown the immense cost of relying on cures. Prevention is a better solution. As a result, quality initiatives are taking place in the market, and the GSEs, investors, and regulators are demanding increased data integrity.
“Data integrity encompasses the completeness, consistency, and accuracy of captured and reported data,” stated Michael Trickey, Managing Director of The Berkshire Group. “This includes data collected on loan applications, during underwriting, upon funding, in secondary and securitization activities, and during servicing and surveillance activities.”
Each step or process generates a wealth of information to be either collected accurately, or perhaps inaccurately or not at all. Businesses can reap huge rewards by determining what data should be collected at each step, and putting in place electronic and human systems to ensure data integrity is achieved and maintained. They can also help avoid regulatory, legal, and reputational problems and costs by capturing the right information for maintaining controls over lending, selling, securitizing and servicing activities. Quality data supports business decisions and aids compliance.
On the business side, data supports predictive models used for 1) designing products and underwriting standards, 2) pricing loans and MSRs on a risk-adjusted basis, 3) implementing marketing strategies, 4) setting collection campaigns and predictive dialer strategies, 5) developing loss mitigation criteria and contract terms, 6) setting bond levels and support requirements, and 7) performing surveillance. If the data used for building and calibrating the models lacks integrity, the models will be faulty.
On the compliance side, quality data helps to detect and head off problems. This role has taken on even greater focus with the Consumer Financial Protection Bureau’s (CFPB’s) mortgage servicing rules effective January 10, 2014. These rules amend the Real Estate Settlement Procedures Act (Regulation X) and the Truth in Lending Act (Regulation Z) and impose stringent compliance audits and fines for non-compliance.
The industry has little in the way of standardized data criteria, definitions, and formats. Even within firms, data integrity may be challenged by legacy systems and employees, competing department or business unit definitions, lack of focus and definition from management, failure to keep up with changing concepts and regulatory definitions, or a host of other issues.
So how can lenders accomplish a comprehensive data integrity and quality program? The solution involves not just technology, but also proper business processes and workflows. From a technology standpoint, the solution appears simple, although applying technology alone becomes more challenging in today’s heavily integrated environments.
Any solution must have a two-step workflow: 1) a system and 2) experienced analysts to revalidate the data, flag changes, identify errors, and provide opportunities to fix those errors by confirming accuracy against the actual loan file. This means having a system and procedures in place to reconcile data throughout the loan file, recognize discrepancies and errors as they occur, and act quickly to repair the data.
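As a minimal sketch of the reconciliation step described above, the following compares key loan fields in a servicing system against the values an analyst confirms from the loan file, flagging any discrepancies for repair. The field names and rate tolerance are illustrative assumptions, not an industry standard.

```python
# Hypothetical sketch: reconcile key loan fields between system data and the
# loan file, returning discrepancies for analyst review. Field names and the
# rate tolerance are illustrative assumptions.

RECONCILED_FIELDS = ["loan_id", "note_rate", "upb", "maturity_date"]

def reconcile_loan(servicing_record: dict, loan_file_record: dict,
                   rate_tolerance: float = 0.0001) -> list:
    """Return (field, system_value, file_value) tuples for fields that disagree."""
    discrepancies = []
    for field in RECONCILED_FIELDS:
        sys_val = servicing_record.get(field)
        file_val = loan_file_record.get(field)
        if field == "note_rate" and sys_val is not None and file_val is not None:
            # Numeric field: allow a small tolerance for rounding differences.
            if abs(sys_val - file_val) > rate_tolerance:
                discrepancies.append((field, sys_val, file_val))
        elif sys_val != file_val:
            discrepancies.append((field, sys_val, file_val))
    return discrepancies
```

In practice, each flagged discrepancy would feed a tracked correction queue so the same error is not reintroduced downstream.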
Firms often use third-party testing of data integrity and of overall origination, sale, and servicing practices to vet and resolve potential problems and to circumvent natural internal rivalries that could reduce the effectiveness of testing.
Testing frequency should be tied to the severity of known problems, but testing should occur at least quarterly. Mechanisms for tracking data flows are critical and should be documented to facilitate quick corrections and to avoid repeating mistakes.
In a proper data integrity review, loan files should be bookmarked and documents inventoried. A document sufficiency review ensures pertinent documents are included and that the information in the documents is consistent and logical. This review depends on the documents each firm requires for its ultimate portfolio, disposition, or exit strategy. If documents change, resulting in data changes, there must be systems in place to ensure the impact of those changes is appropriate and the risks associated with document changes to a loan are fully understood.
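A document sufficiency check of the kind described above can be sketched as a simple comparison of a loan file's inventory against the documents required for the chosen exit strategy. The document names and strategy mapping below are illustrative assumptions; each firm would substitute its own requirements.

```python
# Hypothetical sketch: report which required documents are missing from a
# loan file's inventory for a given exit strategy. Document names and the
# strategy-to-requirements mapping are illustrative assumptions.

REQUIRED_DOCS = {
    "securitization": {"note", "mortgage", "title_policy", "appraisal"},
    "whole_loan_sale": {"note", "mortgage", "appraisal"},
}

def sufficiency_review(inventory: set, exit_strategy: str) -> set:
    """Return the required documents absent from the inventoried loan file."""
    required = REQUIRED_DOCS.get(exit_strategy, set())
    return required - inventory
```

A file that passes returns an empty set; anything else identifies exactly which documents must be obtained before the loan can follow that disposition path.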
No matter what, do not underestimate the data quality problem or the effort required to resolve it. Get in front of the sources of data integrity problems before they undermine the areas that rely on the data.