With rates rising and the pool of borrowers shrinking, pricing strategy is critical to origination success. Finding ways to maintain production levels without giving away the farm on margin is top of mind for all originators as we approach the second half of the year.
Optimal Blue, a division of Black Knight, has seen that the clients most successful in this area are those that have incorporated data into their operations. This approach usually starts with sponsorship from the top: heads of lending and capital markets, CFOs, and CEOs must all buy into data-driven decision-making. It is equally necessary to invest in the resources to manage and analyze that data, whether that means hiring an analyst or allowing staff on the pricing or secondary teams to dedicate material time to data analysis.
The other side of the coin is finding dependable data assets. Even with a great team and a well-designed framework for integrating insights, you can still run into the “garbage in, garbage out” trap that all data analysts, statisticians and scientists try to avoid. Finding and investing in quality information is a fundamental part of a thoughtful data strategy.
In the mortgage industry, there are three primary areas of focus when selecting a pricing data source:
Representativeness
There is no such thing as a mortgage dataset that covers 100% of the market. Thankfully, a dataset that covers a representative sample of the market can be relied upon as an indicator of what the whole market is doing. Optimal Blue's Mortgage Lock Data is one such dataset. As an output of the industry-leading Optimal Blue℠ Product, Pricing and Eligibility (PPE) Engine, part of Black Knight's extensive solutions suite, it captures roughly 40% of all rate locks for new residential originations. Sourced from more than 900 locking lenders, Optimal Blue Mortgage Lock Data has proven to be highly representative of the mortgage market.
Granularity
The second component of a quality data asset is the granularity of the information included. In other words, does the data allow you to dig deeper to find insights? For example, say you run pricing for a lender that operates primarily in Northern California, or even just the Bay Area, and you want to understand pricing and volume trends for your markets. If your data only goes as deep as the state of California, the results may be skewed by markets beyond your focus and lead you astray. To that end, it's critical that your data assets are detailed enough to segment by the products and markets you're serving, as the brief sketch below illustrates.
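To make the idea concrete, here is a minimal sketch of that kind of segmentation using pandas. The column names and rate values are hypothetical, invented for illustration; they are not the Optimal Blue Mortgage Lock Data schema.

```python
import pandas as pd

# Illustrative lock-level records; these column names and values are assumptions
# for this sketch, not an actual Optimal Blue data layout.
locks = pd.DataFrame({
    "state":      ["CA", "CA", "CA", "CA"],
    "metro_area": ["San Francisco", "San Jose", "Los Angeles", "Fresno"],
    "note_rate":  [6.375, 6.250, 6.625, 6.875],
})

# A state-level average blends every California market together...
state_rate = locks.loc[locks["state"] == "CA", "note_rate"].mean()

# ...while a metro-level cut isolates the markets this lender actually serves.
bay_area = locks[locks["metro_area"].isin(["San Francisco", "San Jose"])]
metro_rate = bay_area["note_rate"].mean()

print(f"California average note rate: {state_rate:.3f}%")
print(f"Bay Area average note rate:   {metro_rate:.3f}%")
```

The state-level and metro-level figures can diverge meaningfully; only a dataset granular enough to support the second cut tells you what your own markets are doing.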
Timeliness
Finally, timing is critical for any data intended to be actionable. As we are all acutely aware, particularly over the last 24 months, the market can shift rapidly. If you are making operational decisions – such as margin setting or pricing – based on outdated information, you will be flying blind. Be sure your datasets are delivered frequently and in a timely fashion.
The timing of Optimal Blue’s Mortgage Lock Data, for example, is based on the date of the rate lock, which is one of the earliest trackable points in the life of a loan. Rate lock data can provide as much as a 60–90 day advance look at mortgage trends compared with more dated alternatives, such as closed-loan datasets. In addition, Optimal Blue’s Mortgage Lock Data is delivered daily, so clients can operate with the most current view of market color for pricing, volume and lending composition.
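As a back-of-the-envelope illustration of that lead time, the sketch below computes the gap between a lock date and a close date for a few hypothetical loans; the dates and column names are made up for illustration, not drawn from any real dataset.

```python
import pandas as pd

# Hypothetical loan records with both a lock date and a close date;
# the dates and column names are illustrative only.
loans = pd.DataFrame({
    "lock_date":  pd.to_datetime(["2023-03-01", "2023-03-10", "2023-03-15"]),
    "close_date": pd.to_datetime(["2023-05-05", "2023-05-20", "2023-06-12"]),
})

# Lead time: how much earlier a lock-date-based dataset surfaces each loan
# than a closed-loan dataset would.
loans["lead_days"] = (loans["close_date"] - loans["lock_date"]).dt.days
print(loans[["lock_date", "close_date", "lead_days"]])
```

The difference between those two dates is the head start a lock-based dataset gives you over one keyed to loan closings.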
A data-driven approach to decision-making can be the key to success in competitive, volatile markets. Don’t compromise your pricing strategy by relying on data sources that lack representativeness, granularity and timeliness.
To learn more about how Optimal Blue’s Mortgage Lock Data can support your business, email Sales@OptimalBlue.com.