Here's the one thing you need to know: You can get a valid deposit study even if you don't have the data on hand.
Thanks to a change in regulatory priorities, I now get more questions about deposit studies than any other topic. And one that just keeps coming up is "What do I do if I don't have the data?"
If this is you, I've got good news. You may need to have someone at the bank do a bit of grunt work, but I will show you how to recover sufficient data to complete a basic deposit study.
I recently spoke with a banker who was getting some regulatory pressure to develop a bank-specific deposit study to support the NMDA assumptions in their interest rate risk model. The bank has been around for several decades, so I expected they would have all the data needed in a nice, neat IT report or extract from their core system.
There was just one problem. The bank had done a core conversion in 2012 and failed to keep the old data. This is a very common problem; I see it all the time.
The bank thought they were out of luck. In reality, they just needed to look at the situation differently.
As bankers, we've been trained to be decision makers. And the way that happens is that we associate certain behaviors or circumstances with particular outcomes. Using this frame of reference allows us to quickly make decisions based on our experiences.
Typically, that's a good thing. But it can also blind us to alternative solutions.
The way this trait shows up with deposit studies is that if we are familiar with them at all, it's usually from the perspective of having heard or experienced what I call a static pool deposit study.
The static pool method relies upon tracking the ongoing changes in the same specific individual accounts (the "static pool"), month after month, year after year, to develop a simple model of the lifecycle of your NMDAs.
So the static pool approach requires lots of IT help and monthly data, and it risks exposing the bank to privacy issues because it relies on individually identifiable accounts. Plus we run the risk of tracking the wrong account vintage years and ending up with a bad study.
The dynamic pool method, on the other hand, uses the change in all existing balances to derive the decay rate associated with your NMDA account balances. It has the advantages of being easy to calculate and understand, does not require extremely difficult IT extracts, and does not involve personally identifiable account information, so privacy concerns are minimized.
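To make the dynamic pool idea concrete, here is a minimal sketch of the calculation using hypothetical aggregate balances. The numbers are made up for illustration, and the constant-decay average-life approximation is an assumption on my part, not a full methodology:

```python
# A minimal sketch of the dynamic pool idea, using hypothetical aggregate
# NMDA balances (in $ millions), one observation per quarter.
balances = [100.0, 97.5, 95.2, 92.8, 90.9]

# Period-over-period decay rate: the fraction of total balances that
# ran off from one quarter to the next.
quarterly_decays = [
    (prev - curr) / prev for prev, curr in zip(balances, balances[1:])
]

avg_quarterly_decay = sum(quarterly_decays) / len(quarterly_decays)

# Compound the quarterly rate to an annual decay rate, then approximate
# average life under a constant-decay assumption.
annual_decay = 1 - (1 - avg_quarterly_decay) ** 4
avg_life_years = 1 / annual_decay

print(f"average quarterly decay: {avg_quarterly_decay:.2%}")
print(f"annualized decay rate:   {annual_decay:.2%}")
print(f"implied average life:    {avg_life_years:.1f} years")
```

Notice that only the bank-level totals are needed, which is exactly why no personally identifiable account data ever enters the picture.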
And best of all, it lets us use a different data approach to creating our deposit study.
The first secret to this approach is all about statistical significance and how we analyze our data. Here's the rule of thumb that it will all pivot upon:
You typically need about 55 data points to get a statistically significant result.
As the data points increase, our statistical relationships tend to strengthen. And sometimes you get lucky and can obtain a valid set of relationships with fewer data points.
That's a little more than 4.5 years with monthly data, but it's almost 14 years with quarterly data.
Just keep in mind that data is data, whether it's daily, monthly, quarterly or annual. Statistics doesn't care which of these periodic terms your data comes in; it simply needs consistency in format.
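The arithmetic behind that rule of thumb is easy to check. Taking the 55-observation target from above, here is how much history each reporting frequency implies:

```python
# Rule of thumb from the text: roughly 55 observations are needed for
# a statistically significant result.
TARGET_OBS = 55

periods_per_year = {"monthly": 12, "quarterly": 4, "annual": 1}

for label, per_year in periods_per_year.items():
    years_needed = TARGET_OBS / per_year
    print(f"{label:9s}: {years_needed:5.2f} years of history")
```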
So here's the lucky break for all you community banks worried about not having your data: the FDIC and FFIEC retain online all of the Call Reports you have filed since 2001.
That's right, just about the number of quarterly data observations you need to be pretty much guaranteed a statistically significant set of results.
This is really great timing because even if you have no data on hand, you can use the old Call Reports to extract the line item level data needed to do a basic deposit study.
I understand that recovering this data is a mind-numbing, menial task, but you can get it done, and it won't cost you a penny. And it's better than facing your regulator with the news that you won't be able to comply with their request to develop in-house, bank-specific NMDA assumptions. But keep reading for another big insight.
The second big surprise is that you can mix and match data frequency.
In the case of my community bank friend mentioned above, the bank has about 2 years of monthly data available via the new core system. All they need to do is combine those 24 observations with roughly 8 years of quarterly Call Report data (31 more observations) and voilà, they have all the data they need.
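The mix-and-match arithmetic can be checked in a couple of lines, again using the 55-observation rule of thumb:

```python
# Checking the mix-and-match arithmetic: monthly core data plus
# quarterly Call Report data toward the 55-observation target.
TARGET_OBS = 55
monthly_obs = 24                       # two years of monthly core data

quarterly_needed = TARGET_OBS - monthly_obs
years_of_quarterly = quarterly_needed / 4

print(f"quarterly observations still needed: {quarterly_needed}")
print(f"years of Call Report history:        {years_of_quarterly:.2f}")
```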
Finally, here's the third under-appreciated fact about deposit study modeling.
The data needed for modeling your decay rates (determining average life) is much less sensitive to the size of your data set than the data needed for determining your beta (rate sensitivity) factors.
So even if you have less than the "ideal" minimum of 55 data points, it's worth taking a look to see what the data tells us.
You may find that your smaller data set is sufficient to get you started with a basic deposit study. Maybe you'll be able to only determine average life, but have less success with beta. It's still progress.
And the truth is you have to start somewhere, so why not here and now?
If you're interested in learning more, please let me know. Together we can overcome this challenge.
Photo provided by Steven Depolo