How to Drive the Mortgage Industry Forward

Posted on February 17, 2016

Either your system lets users get answers easily or it doesn’t. Do customers have to call a service rep, or can they find answers themselves?

Today’s mortgage industry faces challenges undreamed of in the past. Regulatory requirements have skyrocketed, and firms have been punished for failing to meet them. Changes stemming from the financial crisis of 2008 are forcing firms to enhance their capabilities: demonstrating proof of title, reconstructing the history of every event in a mortgage’s life, supporting on-site reviews of operational risk, and assessing the effectiveness of internal controls. The changes never stop, as regulations constantly evolve and Fannie Mae regularly updates the requirements in its Selling Guide, the bible for doing business with Fannie. Adapting to these new requirements is complicated by legacy IT infrastructures, which are poorly suited to this new world.

Customer service, meanwhile, is costly and error-prone. When a mortgage customer asks for a history of activities and the bank gives an incorrect answer, the bank can be fined. To avoid fines, banks put multitudes of employees on manual searches, which is costly and time-consuming. Effectively securitizing mortgages is ever more demanding, and rethinking mortgage repositories is crucial.

This is the first in a series of blog posts exploring the mortgage industry’s many data challenges and how firms can enhance their IT infrastructures to handle the new requirements and exploit new opportunities. I will delve into how two related approaches, a metadata repository and a mortgage document repository, can overcome these issues, as well as the mechanics of building each.

Data Challenges: Silos & ETL

Data silos often force firms to perform multiple searches against multiple systems in order to answer a single query. Inconsistent data across silos means that the answer to a query against one data source will differ from the answer from another source. Users may not even be aware of all the data that exists in their firm’s data sources, much less how to query it. And every new regulatory requirement from the Consumer Financial Protection Bureau, the Federal Trade Commission, or one of the many other regulators changes the structure of the documents feeding into a firm’s mortgage infrastructure, creating new ETL or modeling work before those documents can be stored and accessed.
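
To make that concrete, here is a minimal sketch (in Python, with invented system names, keys, and fields) of what answering a single customer question looks like when the data lives in three silos: one question becomes three queries plus hand-written merge code.

```python
# A minimal sketch with invented silos: in-memory dicts stand in for the
# origination, servicing, and title systems. All names are hypothetical.

origination_silo = {"L-1001": [("2006-03-01", "loan originated")]}
servicing_silo = {"1001": [("2012-07-15", "transferred to new servicer")]}  # different key format
title_silo = {"L-1001": [("2006-03-01", "title recorded")]}

def loan_history(loan_id: str) -> list:
    """One question, three queries, plus hand-written merge code."""
    events = []
    events += origination_silo.get(loan_id, [])
    events += servicing_silo.get(loan_id.lstrip("L-"), [])  # per-silo key munging
    events += title_silo.get(loan_id, [])
    return sorted(events)  # and hope every silo uses the same date format

print(loan_history("L-1001"))  # three partial answers stitched into one
```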

Documents Overload

The mortgage industry is necessarily complex. Mortgage loans and the underlying properties are assets that are part of the asset-liability management (ALM) process. As such, accurately valuing both the underlying properties and the securities backed by the mortgages, and accurately quantifying loan default rates, are essential to managing balance sheet exposure.

Individual mortgages can be in operation for as long as 30 years, and they can be sold over and over again. Originating, supporting, and securitizing a mortgage requires a multitude of documents. The laws governing mortgages (and the documents needed to support them) vary from state to state and from year to year, so firms handling large numbers of mortgages must cope with potentially thousands of different document types and versions. In large firms, dozens of systems may be used to handle different aspects of the mortgage process, and mortgages are often securitized or sold on to other firms.
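
Some rough, purely illustrative arithmetic shows how quickly the variants multiply; none of these counts come from any real firm.

```python
# Illustrative arithmetic only; every count here is an assumption.
base_document_types = 40  # notes, deeds, riders, disclosures, ...
states = 50               # required forms and clauses vary by state
vintage_years = 30        # a mortgage can stay in operation for 30 years

# Worst case, if every document type varied by state and by vintage year:
potential_variants = base_document_types * states * vintage_years
print(potential_variants)  # 60000 distinct type/state/year combinations
```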

And, at any time, a firm may need to pull together all the documents relating to a single mortgage, whether in response to a subpoena or for some other reason.

Many of the IT infrastructures used to originate, process, and securitize mortgages were developed when the mortgage industry was simpler. Back then there was little need for a consolidated view of a mortgage’s complete lifespan, whether for risk mitigation or for regulatory compliance. In fact, it often made sense to build a variety of siloed systems, since each could be developed faster and more easily independent of the others.

Querying Across Silos

Because those legacy systems were often built on relational databases, which require fixed and rigid schemas, it can be difficult to integrate large numbers of existing (and even new) data sources. The ETL work needed to bring the data into conformance with existing relational schemas can take months, plus several days of loading for each new data set.
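
Here is a minimal sketch of the difference, with invented field names: under a fixed relational schema, a new regulatory field means a schema change and new mapping code, while a document store can ingest the record as-is.

```python
# Hypothetical incoming record carrying a field the relational schema
# never anticipated (all field names are invented for illustration).
incoming = {
    "loan_id": "L-1001",
    "doc_type": "closing_disclosure",
    "state": "NY",
    "trid_version": "2015-10",  # new regulatory field
}

# Relational route: only columns the schema already knows survive, until
# someone alters the table and rewrites the ETL mapping code.
KNOWN_COLUMNS = ("loan_id", "doc_type", "state")
row = {col: incoming[col] for col in KNOWN_COLUMNS}  # "trid_version" is silently dropped

# Document route: store the record as-is; the new field rides along and
# becomes queryable once indexed, with no up-front remodeling.
document_store = []
document_store.append(incoming)
```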

With different systems containing different types of documents, it can be extremely difficult to query all the documents as an integrated whole. To get around this, firms often serially query multiple data stores and write code to integrate the results, or create limited duplicate data sets to make querying easier. The trouble is that querying across multiple data stores chews up processing, while the duplicate data sets are typically built with little governance: each replica has its own update policies and does its own transformation and enrichment of the data, so the copies drift apart.
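
A minimal sketch of that drift, with hypothetical transformations: two replicas built from the same source record, each applying its own enrichment, end up giving different answers to the same question.

```python
# Hypothetical: two teams copy the same source record, each applying its
# own transformation and enrichment, and the replicas quietly diverge.
source = {"loan_id": "1001", "balance": "250000.00", "state": "ny"}

# Reporting team's replica: rounds the balance to whole dollars.
reporting_replica = {**source, "balance": round(float(source["balance"]))}

# Servicing team's replica: uppercases the state, leaves balance a string.
servicing_replica = {**source, "state": source["state"].upper()}

# The same question now has two answers, depending on which replica you ask.
print(reporting_replica["state"], servicing_replica["state"])      # ny NY
print(reporting_replica["balance"], servicing_replica["balance"])  # 250000 250000.00
```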

This jungle of data is difficult to secure (some of it may live only in spreadsheets) even though it contains critical customer information.

Business Challenges

These technical issues lead to a variety of business challenges, including:

  • Data quality issues
  • Increased risks
  • Limited analytics
  • Slow and inferior decisions
  • High costs
  • Limited ability to detect fraud
  • Inferior pricing of mortgages

Building an Infrastructure for Today and Tomorrow

In this series of blog posts we will discuss how to build an IT infrastructure that avoids these technical issues and allows firms to overcome the business challenges. Specifically, we will examine how to build a metadata and/or mortgage document repository: a “hub” (sketched briefly after the list below) that provides:

  • A single source of truth: a consolidated view of ALL the information relating to a firm’s mortgages
  • Quick access to new data and new data sources, with minimal ETL
  • Support for new data governance and regulatory demands
  • Enterprise data integrity, reliability, and security
  • A complete history of each mortgage
  • The ability to handle many billions of documents
  • Geospatial queries
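
As a taste of what such a hub might look like, here is a minimal sketch of the common “envelope” pattern: each source document is stored intact and wrapped with canonical metadata (identifiers, lineage, location) that makes unified and geospatial queries possible. Every name here is an illustrative assumption, not any particular product’s API.

```python
from datetime import date

def envelope(raw_doc: dict, system: str) -> dict:
    """Wrap a source document, unchanged, in canonical hub metadata."""
    raw_id = raw_doc.get("loan_id") or raw_doc.get("loan_no")
    return {
        "metadata": {
            "loan_id": str(raw_id).lstrip("L-"),         # one canonical key format
            "source_system": system,                     # lineage, for governance
            "ingested": date.today().isoformat(),        # part of the loan's history
            "location": raw_doc.get("property_latlon"),  # enables geospatial queries
        },
        "source": raw_doc,  # the original document, kept intact: minimal ETL
    }

hub = [
    envelope({"loan_id": "L-1001", "property_latlon": (40.71, -74.01)}, "origination"),
    envelope({"loan_no": "1001", "event": "transferred to new servicer"}, "servicing"),
]

# One query surface over everything, e.g. the complete record of one loan:
history = [d for d in hub if d["metadata"]["loan_id"] == "1001"]
print(len(history))  # 2: both silos' documents, found with a single query
```

The design choice that matters here is storing documents as-is and layering canonical metadata on top, rather than forcing every source through a fixed schema first.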

Next up: How do you move from the manual processes, errors, and inefficiencies of today’s overlapping and unmanageable data silos to a single, unified golden source repository? How do you do it while keeping your business running and protecting possibly decades of investment and work? Find out in our next blog: Building the Repository.

David Kaaret

David Kaaret has worked with major investment banks, mutual funds, and online brokerages for over 15 years in technical and sales roles.

He has helped clients design and build high performance and cutting edge database systems and provided guidance on issues including performance, optimal schema design, security, failover, messaging, and master data management.
