Our new training workshop series is getting started with a blockbuster ALCO and asset liability agenda September 15 - 19.
Click to learn more.
Join me as we host some very special guest expert instructors. In fact, our faculty is a veritable "Who's Who" of interest rate risk, including:
- Dave Koch of Farin Associates
- Mark Haberland of Darling Consulting Group
- Paul Allen of Saltmarsh CPAs
Our topics include almost every important and timely ALCO issue on your to-do list including...
- Interest Rate Risk Peer Data
- Model Validation
- Deposit Analytics and Core Funding
- ALCO Strategies for Success
- IRR Contingency Planning
You can imagine what you might be charged for 5 days of content and expertise like this. But here's the best news. We're all working together on a mission to help community bankers and we're going to bring this important workshop series to you at no charge.
That's right. It's free to attend all 5 live sessions.
So bring your management team and your directors.
There are just 2 requirements.
- Community bankers only. No consultants or vendors.
- You must register using your bank email address.
Also, seats are limited so sign up soon.
Click to learn more and to register for this groundbreaking multi-day ALCO training event.
Please let me know if you have any questions.
Photo provided by Ramesh NG
Over the past few months I've had the opportunity to deliver live training workshops to over 1,000 community bankers nationwide. And I've personally seen just how hungry community bankers are for good quality training.
In fact, I've had several bankers ask me about training topics just this week.
Well, here's a special "heads up" for you. I'm working on a great training workshop series including...
- Important and timely items of interest to community bankers
- Highly recognized expert faculty from throughout the industry
- Tips, tricks and pointers you can use in your bank
- Board of Directors training topics
- All content and no selling
Our first workshop session is going to focus on risk management, specifically interest rate risk and asset liability management. And we'll have the best experts and authorities in the business as our distinguished faculty.
Subsequent sessions will focus on...
- Investment portfolio management
- Stress testing
- Capital planning
- Loans and deposits
- Strategic planning
- Performance improvement
- And so much more!
I'll have more on the workshops soon.
Until then, please let me know if you have any particular workshop topics you might want to see covered.
Photo provided by Ramesh NG
Comments on community bank performance and regulation by Minneapolis Fed President Narayana Kocherlakota should be reviewed by all interested in community banking. Read the speech here.
Kocherlakota makes 4 main points on the state of community banking.
- Community bank recovery in asset quality has been strong.
- Lagging earnings and loan growth raise questions about the cost of new and enhanced regulation.
- Low earnings combined with higher compliance costs raise concerns about community bank consolidation.
- As a matter of public policy, Kocherlakota supports tailoring supervision and regulation to reflect the risks and roles of community banks.
Kocherlakota goes further and offers two specific ways regulation could be further tailored in the future.
- Congress and supervisors should exempt all community banks from certain regulations. In fact, he goes so far as to state that "Exempting is the best way to guard against regulatory trickle-down."
- Narrow the focus of current supervisory methods that are too detailed across too many areas and apply to too many banks.
Instead, he suggests that regulators concentrate on the small handful of activities that are correlated with bad results:
- Rapid loan growth
- High lending concentrations
- Specific high-risk types of lending
- Specific wholesale funding strategies
These types of common sense approaches to overly severe regulation should be supported by all community bankers. In our society, the way we make our support known is through our elected officials.
Please call on your representatives to support even handed community bank regulation that focuses on actual risks associated with community banking, and not on nontraditional activities that larger TBTF banks pursue.
Photo provided by Joseph Friedrich
If you've read my NMDA deposit study materials you know I'm a big believer in thinking "average life" anytime someone says "decay rate". To me, average life is just a simpler concept where we already have a good frame of reference.
So when a reader recently wrote in discussing how to calculate average life, I was secretly very pleased. No need to get bogged down in decay rates when average life is what we really want.
It seems that a consultant had taught this reader to calculate average life in a very simple and straightforward way. Here's what he said...
"To calculate our decay speed all we need to do is take all of our closed accounts and determine how long they were opened to determine average life."
I have very mixed feelings about this comment.
First of all, I really like the simplicity of this approach. After all, I'm all about taking complex things and making them simpler.
Second, on a very high level, this conceptually captures what we're trying to accomplish which is to see how long our accounts tend to stick around.
But here's where I run into a problem. We don't really care about accounts nearly as much as we care about dollar balances.
Consider your dormant or near dormant accounts. Do you really care about when they actually close, or do you care about when the balances run down from 100% to 5% or less?
Or think about the common situation where you might have a very few high dollar accounts and lots of smaller accounts. Do you really want to model your few high dollar accounts based upon what happened with the great multitude of low balance accounts?
Ultimately, what I'm saying here is that if you focus on when accounts close you miss the pattern of decay associated with when account balances run down.
I'd guess that account balances tend to run down faster than accounts are closed.
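To make the difference concrete, here's a minimal sketch using purely hypothetical account data (not figures from any actual bank): two large accounts whose balances run off quickly alongside many small, long-lived accounts. Counting accounts equally tells one story; weighting by dollars tells a very different one.

```python
# Hypothetical data: two large accounts whose balances run off quickly,
# plus twenty small accounts that stay open for years.
accounts = [
    (500_000, 2.0),  # (balance, years until the balance runs off)
    (400_000, 3.0),
] + [(5_000, 8.0)] * 20

# One vote per account: the approach based on account closing dates
simple_avg = sum(life for _, life in accounts) / len(accounts)

# One vote per dollar: weight each account's life by its balance
total_bal = sum(bal for bal, _ in accounts)
weighted_avg = sum(bal * life for bal, life in accounts) / total_bal

print(f"Account-count average life: {simple_avg:.1f} years")      # 7.5 years
print(f"Dollar-weighted average life: {weighted_avg:.1f} years")  # 3.0 years
```

In this toy case the account-count method reports an average life more than twice the dollar-weighted figure, which is exactly the overestimation risk described above.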
Why is this important?
Because if you focus on account closing dates, you're much more likely to overestimate your average life. And overestimated NMDA average lives leave you open to both unexpected interest rate risk and regulatory criticism.
Let me know if you're ready to document your bank's NMDA average life and beta numbers.
Photo provided by Colin
Here's the one thing you need to know: You can get a valid deposit study even if you don't have the data on hand.
Thanks to a change in regulatory priorities, I now get more questions about deposit studies than any other topic. And one that just keeps coming up is "What do I do if I don't have the data?"
If this is you, I've got good news. You may need to have someone at the bank do a bit of grunt work, but I will show you how to recover sufficient data to complete a basic deposit study.
I recently spoke with a banker who was getting some regulatory pressure to develop a bank-specific deposit study to support the NMDA assumptions in their interest rate risk model. The bank has been around for several decades, so I expected they would have all the data needed in a nice, neat IT report or extract from their core system.
There was just one problem. The bank had done a core conversion in 2012 and failed to keep the old data. This is a very common problem. I see this all the time.
The bank thought they were out of luck. In reality, they just needed to look at the situation differently.
As bankers, we've been trained to be decision makers. And the way that happens is that we associate certain behaviors or circumstances with particular outcomes. Using this frame of reference allows us to quickly make decisions based on our experiences.
Typically, that's a good thing. But it can also blind us to alternative solutions.
The way this trait shows up with deposit studies is that if we are familiar with them at all, it's usually from the perspective of having heard or experienced what I call a static pool deposit study.
Static pool relies upon tracking the ongoing changes in the same specific individual accounts (the "static pool"). These changes are tracked, for these select individual deposit accounts, month after month, year after year. This is done to develop a simple model of the lifecycle of your NMDAs.
So the static pool approach demands lots of IT help and monthly data, and it risks exposing the bank to privacy issues because it relies on individually identifiable accounts. Plus we run the risk of tracking the wrong account vintage years and ending up with a bad study.
The dynamic pool method, on the other hand, uses the change in all existing balances to derive the decay rate associated with your NMDA account balances. It has the advantages of being easy to calculate and understand, does not require extremely difficult IT extracts, and does not involve personally identifiable account information, so privacy concerns are minimized.
And best of all, it lets us use a different data approach to creating our deposit study.
The first secret to this approach is all about statistical significance and how we analyze our data. Here's the rule of thumb that it will all pivot upon:
You typically need about 55 data points to get a statistically significant result.
As the data points increase, our statistical relationships tend to strengthen. And sometimes you get lucky and can obtain a valid set of relationships with fewer data points.
That's a little more than 4.5 years with monthly data... but it's almost 14 years with quarterly data.
Just keep in mind that data is data, whether it's daily, monthly, quarterly or annually. And statistics don't care if your data is in any of these periodic terms. It simply needs some consistency in format.
So here's the lucky break for all you community banks that are worried about not having your data. It just so happens that FDIC and FFIEC retain all of your Call Reports that have been filed since 2001 online.
That's right: almost exactly the number of quarterly data observations you need to be pretty much guaranteed a statistically significant set of results.
This is really great timing because even if you have no data on hand, you can use the old Call Reports to extract the line item level data needed to do a basic deposit study.
I understand that recovering this data is a mind-numbing menial task, but you can get it done, and it won't cost you a penny. And it's better than facing your regulator with the news that you just won't be able to comply with their request to develop in-house, bank-specific NMDA assumptions. But keep reading for another big insight.
The second big surprise is that you can mix and match data frequency.
So in the case of my community bank friend mentioned above, they have about 2 years of monthly data available via the new core system. All they need to do is combine those 24 observations with about 7 years of quarterly data and, voilà, they have all the data they need.
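As a sketch of that mix-and-match idea (using made-up balance figures, not any actual bank's data), the two histories simply stack into one observation series:

```python
# Hypothetical balance histories from two sources.
# ~7 years of quarter-end NMDA balances recovered from archived Call Reports:
quarterly_history = [90.0 + 0.4 * i for i in range(28)]  # 28 observations

# ~2 years of month-end balances from the new core system:
monthly_history = [101.5 + 0.2 * i for i in range(24)]   # 24 observations

# "Data is data": every balance observation is one data point for the study
observations = quarterly_history + monthly_history
print(f"Total observations: {len(observations)}")  # 52, near the ~55 rule of thumb
```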
Finally, here's the third under-appreciated fact about deposit study modeling.
The data needed for modeling your decay rates (determining average life) is much less sensitive than the data needed for determining your beta (rate sensitivity) factors.
So even if you have less than the "ideal" minimum of 55 data points, it's worth taking a look to see what the data tells us.
You may find that your smaller data set is sufficient to get you started with a basic deposit study. Maybe you'll only be able to determine average life and have less success with beta. It's still progress.
And the truth is you have to start somewhere, so why not here and now?
If you're interested in learning more, please let me know. Together we can overcome this challenge.
Photo provided by Steven Depolo
Matters Requiring Board Attention (MRBA) trends are a hot topic in the latest FDIC Supervisory Insights. Read the full publication here.
Over the period 2010 - 2013 the FDIC's MRBA comments on the Report of Examination (ROE) were collected, categorized and statistically aggregated. This summary data forms the basis of the article.
The most common areas ("Top 5") mentioned in MRBAs, along with key areas of specific mention, are listed below. Keep in mind these are typically for "1" or "2" rated institutions.
- Loans (Cited in 69% of all MRBAs)
- Board or Management (45%)
- Violations (24%)
- Earnings (24%)
- Interest Rate Risk (24%)
Loan comments focused on credit administration, problem assets, ALLL deficiencies and concentrations. Over the period of the study (2010 - 2013) the loan category has been declining in frequency, although it is still by far the most common category.
Board / Management comments focused on policies, audit, strategic planning and succession planning.
Violations typically involved appraisals and/or insider lending practices.
Earnings is primarily concerned with strategies to safely improve earnings.
Interest Rate Risk mentions have been on the rise since 2010. This should be no surprise as numerous regulatory releases and attention since then have targeted interest rate risk. Eventually rates will rise and your regulators' concerns will be realized. The time to prepare is now.
Generally, citations for interest rate risk have involved improved monitoring and control of the IRR process.
Liquidity didn't make the "Top 5", but deserved special mention for declining in importance over the period. Again, as the crisis passes this is expected behavior.
IT issues on the other hand were not in the "Top 5" but have increased in frequency over time. Watch this emerging category.
On Friday July 25, the FDIC was named receiver for GreenChoice Bank (IL).
A link to our bank rating report is shown below.
As is typical, an examination of the financial position of this bank indicates well below normal capital levels and generally above normal noncurrent loans.
You can learn more about our bank ratings system, including video tutorials, on our website.
Photo provided by Nina Matthews Photography
My recent post about using Held to Maturity (HTM) to help address interest rate risk in the investment portfolio has hit a few nerves. I know this because I've quickly received some emails back questioning different aspects of the idea.
I love it when my readers respond back with questions. First of all, it means they are reading what I wrote. Second, it shows they are really thinking about the nuances of applying techniques and what it might mean for their bank.
Idea leads to questions followed by answers, and then the whole process repeats itself. My daughter, who studies the classics, would call it the Socratic approach.
I just call it proof that community bankers are working hard to do right for their institutions.
So anyhow, on to the EVE question. My reader writes...
"In terms of EVE, the actual value of the HTM securities should be determined by discounted cash flows. Despite the HTM accounting designation, the value of those securities would decline in an instantaneous upward shock."
Not only do I love it that this reader took the time to think about EVE, I'm also flattered that they wrote to me seeking further info. That's a win / win scenario.
Now, here's the important part. My reader is exactly right.
In the EVE analysis, the HTM security values will be determined by discounting the asset cash flows. And, in an upward shock the value of the HTM portfolio would indeed be expected to decline.
Just like the AFS portfolio. Only without that messy reduction to the AOCI capital account.
But that's enough theory. Let's look at a practical application.
Using March 2014 data from the FDIC's SDI database, focused on banks less than $1 billion asset size, we can sketch out the rough parameters of the average bank's balance sheet. It looks a lot like this:
Securities     24%   Deposits      89%
Loans          65%   Other          1%
Other           1%   Capital       10%
Total Assets  100%   Total L&NW   100%
To make the math easy, let's just assume this bank has $100mm assets. That means they have a $24mm securities portfolio.
So if as much as 10% of the bank's securities fit my earlier description of 4%+ long munis then they would be talking about "risking" a whopping $2.4mm move into HTM. Hardly an earth-shaking portfolio reallocation.
And if they took it to the extreme like those big banks have and put a full 20% into HTM, even if we round up we're only talking about $5mm total.
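A quick back-of-the-envelope check of those figures, under the same assumed $100mm balance sheet:

```python
# Assumed $100mm bank with the average balance sheet sketched above.
total_assets = 100_000_000
securities = 0.24 * total_assets      # 24% of assets => $24mm portfolio

modest_htm = 0.10 * securities        # move 10% of securities to HTM
big_bank_htm = 0.20 * securities      # or 20%, like the largest banks

print(f"10% of securities: ${modest_htm / 1e6:.1f}mm")    # $2.4mm
print(f"20% of securities: ${big_bank_htm / 1e6:.1f}mm")  # $4.8mm (~$5mm)
```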
I dare say that's not going to put many banks over the edge.
And if it would strain your bank to commit somewhere between 2% and 5% of assets to HTM, or if you have other asset quality, liquidity, interest rate risk, or capital issues then just say "No" and look for another alternative.
But for the vast majority of well capitalized and well run banks, moving a small security position that we've already decided we're going to keep for the long term (funded with dedicated term funding...don't forget that piece) to HTM is pretty much going to be a nonevent.
It's just that we've been trained to not do this. That's what makes it so hard. It's always hard to break old habits.
So anyhow, back to the EVE specifics. Let's review a few base assumptions.
1) We assume the discounted cash flow approach is relatively accurate in estimating security values.
2) We assume that the security, whether AFS or HTM, was added at 100 (par).
Then the accounting designation of the muni as AFS or HTM is not going to change your EVE calculation at all. Base EVE is the same, and the rate-shocked EVE will not significantly vary between AFS and HTM.
But what about a security that has already suffered some price deterioration? In that case we're depending on the book-to-base adjustment to initial EVE to accurately capture the price decline in the HTM security. The AFS price decline is captured automatically.
Again, no real difference exists.
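To see why the accounting label drops out, here's a minimal discounted cash flow sketch for a hypothetical 10-year, 4% annual-pay bond (all figures illustrative). The valuation function never asks whether the bond is AFS or HTM; only the cash flows and the discount rate matter.

```python
def pv(cash_flows, rate):
    """Present value of annual cash flows at a flat discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Hypothetical 10-year, 4% annual-pay bond, face value 100
face, coupon = 100.0, 4.0
cash_flows = [coupon] * 9 + [coupon + face]

base_value = pv(cash_flows, 0.04)     # discounted at its own yield => par
shocked_value = pv(cash_flows, 0.07)  # +300bp instantaneous shock

# The label (AFS vs. HTM) never appears: both decline identically in the shock
print(f"Base value:    {base_value:.2f}")     # 100.00
print(f"Shocked value: {shocked_value:.2f}")  # below par
```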
Don't let the fear of the unknown keep you from using all the tools available to you.
Just be sure to use them in a thoughtful and responsible way, consistent with your bank's overall size, complexity and risk position.
And please keep those questions and comments coming. Thoroughly discussing and debating these ideas helps us all become better bankers.
Photo provided by Ben Crowe
I've got good news for community bankers trying to balance higher interest rate risk on investment securities with the immediate hit to income associated with risk reduction.
But before I share this "new" strategy, let's review the interest rate risk regulatory environment.
We're all familiar with recent warnings from OCC and FDIC about the interest rate risk in our securities portfolios...especially longer duration securities.
And these warnings come with good reason. After all, unrealized losses in our available for sale (AFS) portfolio reduce our AOCI capital account and can cause increased regulatory scrutiny. For instance, consider this FDIC warning from earlier this year...
"More troubling to bankers is the prospect that regulators will likely be downgrading bank capital adequacy and liquidity ratings despite the fact that unrealized losses on securities will, for the most part, not directly reduce regulatory capital.
This may occur as a precautionary measure in advance of banks that may be forced to mark down GAAP capital or in the event that depreciated securities must be sold transforming unrealized losses into realized losses."
Not wanting to be left out of all the regulatory fun, OCC weighed in on the topic as well.
So, now that we're properly warned and we can practically see that examiner focusing on our securities portfolio, let's look at our "solution". In this case, we can simply mimic our larger TBTF banking brethren.
So what's the groundbreaking strategy?
Use Your Held to Maturity (HTM) Portfolio
As of March 31, 20 of the 25 largest banks in the US (80%, including all of the "Top 10") are using their HTM portfolios. And they're not just putting a few securities in here and there.
In fact, these 20 banks are averaging a full 19.6% of their aggregate $2 TRILLION portfolios in the held to maturity category.
Now I know we've all been scolded for the past 20 years about how it's taboo to use HTM, but instead of worshiping at the altar of the AFS, let's take this "Big Bank Strategy" and turn it into something we can use in our community banks.
In fact, let's make it something we can explain and defend when we talk to our regulators. Because if all we do is put some bonds in HTM we won't escape the regulators' scorn.
We're not going to sell our 4%+ long munis, so we might as well adopt a rolling strategy of dedicated term funding that lets us lock in some profits while we wait for clearer direction on interest rates.
So let's turn it into a practical strategy. One we can use at our community bank and not an ivory tower strategy we need to use calculus to explain.
Here's the idea:
- Take some long 4%+ munis
- Fund with intermediate term 2% FHLB Advances
- Lock in several years of 2%+ ROA in a 1% ROA environment
- As the funding rolls toward maturity, book your profits and reevaluate
The key to making this work is the combination of dedicated term funding while we wait for a clearer picture of where rates might go. Most of all, discuss this in the boardroom, document it fully, and prepare minutes to show your intent and action.
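Here's the spread math sketched out with the hypothetical position and rates from the example above (a $5mm muni block at 4% against 2% FHLB advances):

```python
# Hypothetical position and rates from the example above
position = 5_000_000   # muni block committed to HTM
muni_yield = 0.04      # 4%+ long munis
advance_rate = 0.02    # intermediate-term FHLB advance

spread = muni_yield - advance_rate          # 2% locked-in spread
annual_spread_income = position * spread

print(f"Annual spread income: ${annual_spread_income:,.0f}")  # $100,000
print(f"Spread on the position: {spread:.1%}")                # 2.0%
```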
I don't know what you think, but if we get a big rate move, I'm not so sure that muni yields will track up alongside the overall market.
Now, let's be clear. This is not a strategy for every bank. If you have other capital, liquidity or IRR challenges, you might want to go slowly.
Similarly, we're really only talking about a specific use case where you have a security (in my example long munis) that you have really already determined that you are indeed going to hold to maturity.
To the extent that you have the desire, intent and capability to hold this security to maturity, then why shouldn't you use HTM, particularly when you combine it with dedicated funding and informed board approval?
Photo provided by PhotoSteve101
It's no surprise given regulatory pronouncements that community bank deposit studies are a hot button topic with regulators. What is a surprise is just how far down the food chain the regulators are pushing this concept.
I understand the need for bank-specific deposit assumptions. You know, non-maturity deposit decay rates and beta assumptions.
Bank-specific assumptions are always preferable, especially with NMDAs where small assumption changes can drive big reported interest rate risk sensitivity changes.
But I have recently had several clients less than $50mm in assets tell me their regulator has demanded bank-specific deposit assumptions from them. I think that's overkill.
There are only 2 groups that community banks can find themselves in right now:
- Their regulator has already demanded a bank-specific deposit study; or,
- Their regulator will soon demand that they produce bank-specific deposit assumptions.
So what's a community bank to do? I think you should prepare to create your own deposit study.
Here are a few pointers to get you started on the right path.
Static Pool Method: Static deposit study analysis relies upon the selection of a beginning year cohort set of deposits, and tracks the behavior of these specific accounts over time. Because the static method requires identifying and tracking specific accounts over time, the static method requires more extensive IT resources and involves a greater risk of privacy issues, due to the specific account identification required.
The static approach delivers results that illustrate the year-by-year decay for a finite set of accounts opened during one specific cohort year. Results attributable to any specific cohort are likely to vary greatly from year to year.
Dynamic Pool Method: The dynamic pool method uses the change in all existing balances to derive the decay rate associated with your NMDA account balances. It has the advantages of being easy to calculate and understand, does not require extremely difficult IT extracts, and does not involve personally identifiable account information, so privacy concerns are minimized.
For almost all community banks, I recommend the dynamic pool methodology.
Determine Level of Detail
You must determine the level of detail you will need for your assumptions.
a) I suggest, at a minimum, separate analyses for Non-Interest Bearing Checking, Interest Bearing Checking, Money Market, and Savings accounts.
b) If you have tiered accounts with significantly different rates, consider adding additional levels of detail in the future.
Just keep in mind that a simpler study that is completed and documented beats a more sophisticated study that remains a theory or "what-if" project. There's always time to extend your study to a more detailed perspective later.
For each type of account you wish to study, you will need...
a) Monthly history of total balances per that deposit class.
b) Monthly history of new account balances per deposit class.
c) Rate history for each deposit account type being studied.
d) Monthly history of total time account balances.
e) Monthly history for total bank assets.
Note: Almost no one has the data complete when they start a project like this. For the first iteration, work with what you have and collect more data for better future results.
Also, no account-specific or personally identifiable data is needed, so there are no additional privacy issues.
Deposit Study Steps
From these balance and rate histories, make progress in the following fashion.
1) Identify and segregate any surge balances. Surge balances will be subject to a more aggressive assumption on decay.
2) Identify balances not sensitive to rates.
3) Calculate decay rates on remaining core balances that are rate sensitive.
4) Calculate and document the rate sensitivity (beta) of each deposit type or category studied.
Over time, as additional data is collected, the analysis will be updated, refined and perhaps expanded as needed.
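Here's a minimal sketch of the core decay calculation (step 3) under simplifying assumptions, using hypothetical monthly balances. Surge-balance segregation, rate-insensitive balances, and beta estimation would layer on top of this, and the average-life translation assumes a roughly constant decay rate.

```python
# Hypothetical month-end total balances for one deposit class, and the
# portion of each month's total that came from newly opened accounts
totals = [100.0, 99.5, 99.1, 98.4, 98.0, 97.3]
new_balances = [0.0, 1.5, 1.6, 1.3, 1.6, 1.3]

# Existing balances retained each month = this month's total minus new money,
# compared against last month's total
monthly_decay = []
for prev, total, new in zip(totals, totals[1:], new_balances[1:]):
    retained = total - new
    monthly_decay.append(1 - retained / prev)

avg_monthly_decay = sum(monthly_decay) / len(monthly_decay)
annual_decay = 1 - (1 - avg_monthly_decay) ** 12

# For a constant decay rate, average life is roughly the reciprocal
average_life_years = 1 / annual_decay

print(f"Average monthly decay: {avg_monthly_decay:.2%}")
print(f"Annualized decay rate: {annual_decay:.1%}")
print(f"Implied average life: {average_life_years:.1f} years")
```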
There are really 3 ways that you can approach a community bank deposit study project.
1) Take the blueprint above and work it out on your own.
2) Let me teach you how to do it in a more detailed training focused on each of these steps. At the end of the training, if you have been doing the "homework" along the way, you will end up with a functional and documented assumption process.
3) Give me the data and let me do it for you.