It's no surprise, given regulatory pronouncements, that community bank deposit studies are a hot-button topic with regulators. What is a surprise is just how far down the food chain the regulators are pushing this concept.
I understand the need for bank-specific deposit assumptions. You know, non-maturity deposit account (NMDA) decay rates and beta assumptions.
Bank-specific assumptions are always preferable, especially for NMDAs, where small assumption changes can drive big changes in reported interest rate risk sensitivity.
But I have recently had several clients with less than $50mm in assets tell me their regulator has demanded bank-specific deposit assumptions. I think that's overkill.
There are only 2 groups that community banks can find themselves in right now:
- Their regulator has already demanded a bank-specific deposit study; or,
- Their regulator will soon demand that they produce bank-specific deposit assumptions.
So what's a community bank to do? I think you should prepare to create your own deposit study.
Here are a few pointers to get you started on the right path.
Static Pool Method: A static deposit study selects a beginning-year cohort of deposits and tracks the behavior of those specific accounts over time. Because it requires identifying and tracking individual accounts, the static method demands more extensive IT resources and carries a greater risk of privacy issues.
The static approach delivers results that illustrate the year-by-year decay for a finite set of accounts opened during one specific cohort year. Results attributable to any specific cohort are likely to vary greatly from year to year.
Dynamic Pool Method: The dynamic pool method uses the change in total existing balances to derive the decay rate for your NMDA balances. It is easy to calculate and understand, does not require difficult IT extracts, and involves no personally identifiable account information, so privacy concerns are minimized.
For almost all community banks, I recommend the dynamic pool methodology.
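To make the dynamic pool idea concrete, here is a minimal sketch, assuming you have a monthly history of total balances and of new-account balances for a single deposit class. The function names, sample numbers, and annualization approach are my own illustrative assumptions, not a prescribed methodology:

```python
def monthly_decay_rates(total_balances, new_balances):
    """Estimate monthly runoff of existing balances for one deposit class.

    total_balances: month-end totals for the class.
    new_balances:   balances in accounts opened during each month.
    Runoff in month t = prior total + new money - current total,
    expressed as a fraction of the prior total.
    """
    rates = []
    for t in range(1, len(total_balances)):
        runoff = total_balances[t - 1] + new_balances[t] - total_balances[t]
        rates.append(max(runoff, 0.0) / total_balances[t - 1])
    return rates

def annualized_decay(monthly_rates):
    """Convert the average monthly decay rate to an annual decay rate."""
    avg = sum(monthly_rates) / len(monthly_rates)
    return 1 - (1 - avg) ** 12

# Illustrative month-end totals and new-account balances for one class.
totals = [100_000, 99_000, 97_500, 96_800]
new_money = [0, 1_000, 500, 800]
rates = monthly_decay_rates(totals, new_money)
print(f"annualized decay: {annualized_decay(rates):.1%}")
```

Note that aggregate runoff is backed out from totals and new money alone, which is exactly why the dynamic method needs no account-level extract.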
Determine Level of Detail
You must determine the level of detail you will need for your assumptions.
a) I suggest at least separate analysis for Non-Interest Bearing Checking, Interest Bearing Checking, Money Market, and Savings accounts.
b) If you have tiered accounts with significantly different rates, consider adding additional levels of detail in the future.
Just keep in mind that a simpler study that is completed and documented beats a more sophisticated study that remains a theory or "what-if" project. There's always time to extend your study to a more detailed perspective later.
For each type of account you wish to study, you will need...
a) Monthly history of total balances for each deposit class.
b) Monthly history of new account balances per deposit class.
c) Rate history for each deposit account type being studied.
d) Monthly history of total time account balances.
e) Monthly history for total bank assets.
Note: Almost no one has the data complete when they start a project like this. For the first iteration, work with what you have and collect more data for better future results.
Also, no account-specific or personally identifiable data is needed, so there are no additional privacy issues.
Deposit Study Steps
From these balance and rate histories, proceed as follows.
1) Identify and segregate any surge balances. Surge balances will be subject to a more aggressive decay assumption.
2) Identify balances not sensitive to rates.
3) Calculate decay rates on remaining core balances that are rate sensitive.
4) Calculate and document the rate sensitivity (beta) of each deposit type or category studied.
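As one hedged illustration of step 1, a common approach is to treat the portion of a deposit class above its historical share of total bank assets as surge balances, which is one reason the data list above includes total bank assets. The function name, baseline window, and share statistic below are illustrative assumptions:

```python
def surge_balance(class_balances, total_assets, baseline_months=36):
    """Estimate the balance above the class's baseline share of total assets.

    class_balances: monthly history of total balances for the deposit class.
    total_assets:   monthly history of total bank assets, same months.
    The first `baseline_months` observations define the "normal" share.
    """
    base_share = sum(
        b / a for b, a in zip(class_balances[:baseline_months],
                              total_assets[:baseline_months])
    ) / baseline_months
    current_share = class_balances[-1] / total_assets[-1]
    excess_share = max(current_share - base_share, 0.0)
    return excess_share * total_assets[-1]

# Illustrative: 36 baseline months at a 10% share of assets, then a jump.
balances = [10.0] * 36 + [15.0]
assets = [100.0] * 36 + [110.0]
print(f"estimated surge: {surge_balance(balances, assets):.1f}")
```

The surge portion identified this way would then be carved out before calculating decay on the remaining core balances.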
Over time, as additional data is collected, the analysis will be updated, refined and perhaps expanded as needed.
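For step 4, the rate sensitivity (beta) of a deposit class can be sketched as the slope of changes in your offered deposit rate against changes in a market rate, fit by ordinary least squares. The choice of market rate and the sample numbers here are illustrative assumptions:

```python
def deposit_beta(deposit_rates, market_rates):
    """OLS slope: change in the offered deposit rate per unit change
    in the market rate, using month-over-month rate changes."""
    dd = [b - a for a, b in zip(deposit_rates, deposit_rates[1:])]
    dm = [b - a for a, b in zip(market_rates, market_rates[1:])]
    mean_d = sum(dd) / len(dd)
    mean_m = sum(dm) / len(dm)
    cov = sum((x - mean_m) * (y - mean_d) for x, y in zip(dm, dd))
    var = sum((x - mean_m) ** 2 for x in dm)
    return cov / var

# Illustrative: market rate rises 200bp while the offered rate rises ~50bp,
# implying a beta of roughly 0.25.
market = [0.25, 0.50, 1.00, 1.75, 2.25]
offered = [0.10, 0.16, 0.29, 0.48, 0.60]
print(f"beta: {deposit_beta(offered, market):.2f}")
```

A low beta like this one means the deposit rate moves only a fraction of the market move, which is typical of core NMDA balances.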
There are really 3 ways that you can approach a community bank deposit study project.
1) Take the blueprint above and work it out on your own.
2) Let me teach you how to do it in a more detailed training focused on each of these steps. At the end of the training, if you have been doing the "homework" along the way, you will end up closer to a functional and documented assumption process.
3) Give me the data and let me do it for you.