Yuliya Demyanyk – Iftekhar Hasan
Financial crises and bank failures:
a review of prediction methods
Bank of Finland Research
Discussion Papers
35 • 2009
Suomen Pankki
Bank of Finland
PO Box 160
FI-00101 HELSINKI
Finland
+358 10 8311
Yuliya Demyanyk* – Iftekhar Hasan**
Financial crises and bank
failures: a review of prediction
methods
The views expressed in this paper are those of the authors and
do not necessarily reflect the views of the Bank of Finland.
* Federal Reserve Bank of Cleveland.
** Rensselaer Polytechnic Institute and Bank of Finland. Corresponding author.
We thank Kent Cherny for excellent comments and Qiang Wu
for research assistance.
ISBN 978-952-462-564-7, ISSN 0785-3572 (print)
ISBN 978-952-462-565-4, ISSN 1456-6184 (online)
Helsinki 2009
Financial crises and bank failures:
a review of prediction methods
Bank of Finland Research
Discussion Papers 35/2009
Yuliya Demyanyk – Iftekhar Hasan
Monetary Policy and Research Department
Abstract
In this article we provide a summary of empirical results obtained in several
economics and operations research papers that attempt to explain, predict, or
suggest remedies for financial crises or banking defaults, as well as outlines of the
methodologies used. We analyze financial and economic circumstances associated
with the US subprime mortgage crisis and the global financial turmoil that has led
to severe crises in many countries. The intent of the article is to promote future
empirical research that might help to prevent bank failures and financial crises.
Keywords: financial crises, banking failures, operations research, early warning
methods, leading indicators, subprime markets
JEL classification numbers: C44, C45, C53, G01, G21
A review of methods for predicting financial crises and bank failures

Bank of Finland Discussion Papers 35/2009

Yuliya Demyanyk – Iftekhar Hasan
Monetary Policy and Research Department

Tiivistelmä (abstract in Finnish)

This paper reviews published empirical studies in economics and operations research that seek to explain the causes of financial crises and bank failures, to predict banking and financial crises, or to examine the policy options for managing such crises. The paper also summarizes the methods used in empirical research on financial and banking crises. In addition, it examines the features of the financial system and the economy associated with the crisis in the US housing loan market and the global financial market turmoil that led to severe crises in many countries. The central purpose of the study is to promote future empirical research that could help prevent financial and banking crises.

Keywords: financial crises, bank failures, operations research, early warning methods, leading indicators, subprime mortgages

JEL classification: C44, C45, C53, G01, G21
Contents

Abstract
Tiivistelmä (abstract in Finnish)
1 Introduction
2 Review of econometric analyses of the subprime crisis
2.1 Collapse of the US subprime mortgage market
2.2 The subprime crisis is not unique
2.3 Selected analyses of bank failure prediction
2.4 Remedies for financial crises
3 Review of operations research models
4 Concluding remarks
References
1 Introduction
This article reviews econometric and operations research methods used in the empirical literature to describe, predict, and remedy financial crises and mortgage defaults. Such an interdisciplinary approach is beneficial for future research, as many of the methods used in isolation are not capable of accurately predicting financial crises and defaults of financial institutions.
Operations research is a complex and interdisciplinary tool that combines mathematical modeling, statistics, and algorithms. It is often employed by managers and management scientists, and it is based on techniques that seek to determine optimal or near-optimal solutions to complex problems and situations.
Many analytical techniques used in operations research have similarities with functions of the human brain; they are called ‘intelligence techniques.’ For example, Neural Networks (NN) is the most widely used model among the intelligence techniques.¹ NN models have developed from the field of artificial intelligence and brain modeling. They have mathematical and algorithmic elements that mimic the biological neural networks of the human nervous system. The model uses nonlinear function approximation tools that test the relationship between independent (explanatory) and dependent (to be explained) factors. The method considers an interrelated group of artificial neurons and processes information associated with them using a so-called connectionist approach, where network units are connected by a flow of information. The structure of the model changes based on external or internal information that flows through the network during the learning phase.
Compared to statistical methods, NN have two advantages. The most important of these is that the models make no assumptions about the statistical distribution or properties of the data, and therefore tend to be more useful in practical situations (as most financial data do not meet the statistical requirements of certain statistical models). Another advantage of the NN method is its reliance on nonlinear approaches, so that one can be more accurate when testing complex data patterns. The nonlinearity feature of NN models is important because one can argue that the relation between explanatory factors and the likelihood of default is nonlinear (several statistical methodologies, however, are also able to deal with nonlinear relationships between factors in the data).
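To make the approach concrete, the following minimal sketch (Python with scikit-learn, used for all illustrations in this review) fits a small feedforward network to simulated loan data. The variable names, the data-generating rule, and the network size are our illustrative assumptions, not the specification of any study discussed here.

```python
# Minimal sketch: a neural network as a nonlinear default classifier.
# All names and the simulated data-generating rule are illustrative
# assumptions, not taken from any paper reviewed in this article.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 2000
credit_score = rng.normal(620, 60, n)        # hypothetical explanatory factor
ltv = rng.uniform(0.5, 1.1, n)               # hypothetical loan-to-value ratio
# Nonlinear default rule: risk jumps for high LTV and low credit scores.
z = -4 + 6 * np.maximum(ltv - 0.8, 0) + 0.02 * (600 - credit_score)
default = rng.uniform(size=n) < 1 / (1 + np.exp(-z))

X = np.column_stack([credit_score, ltv])
model = make_pipeline(StandardScaler(),      # scale inputs before the network
                      MLPClassifier(hidden_layer_sizes=(16,),
                                    max_iter=1000, random_state=0))
model.fit(X, default)
print("in-sample accuracy:", model.score(X, default))
```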
This paper is related to the work of Demirguc-Kunt and Detragiache (2005), who review two early warning methods, the signals approach and the multivariate probability model, that are frequently used in empirical research analyzing banking crises. Bell and Pain (xxxx) review the usefulness and applicability of the leading indicator models used in the empirical research analyzing and predicting financial crises. The authors note that the models need to be improved in order to be a more useful tool for policymakers and analysts.
In this review we show that, in the empirical literature aiming to better predict and analyze defaults and crises, statistical techniques are frequently accompanied by intelligence techniques for better model performance.
¹ Chen and Shih (2006) and Boyacioglu et al (2008).
In most of the cases reviewed, models that use operations research techniques, alone or in combination with statistical methods, predict failures better than statistical models alone. In fact, hybrid intelligence systems, which combine several individual techniques, have recently become very popular.
The paper also provides an analysis of financial and economic circumstances associated with the subprime mortgage crisis. Many researchers, policymakers, journalists, and other individuals blame the subprime mortgage market and its collapse for triggering the global crisis; many also wonder how such a relatively small subprime market could cause so much trouble around the globe, especially in countries that did not get involved with subprime lending or with investment in subprime securities. We provide some insights into this phenomenon.
The subprime credit market in the United States largely consists of subprime mortgages. The term ‘subprime’ usually refers to a loan (mortgage, auto, etc.) that is viewed as riskier than a regular (prime) loan in the eyes of a lender. It is riskier because the expected probability of default for these loans is higher. There are several definitions of subprime available in the industry. A subprime loan can be (i) originated to a borrower with a low credit score and/or history of delinquency or bankruptcy, and/or poor employment history; (ii) originated by lenders specializing in high-cost loans and selling fewer loans to government-sponsored enterprises (not all high-cost loans are subprime, though); (iii) part of subprime securities; and (iv) certain mortgages (eg, 2/28 or 3/27 ‘hybrid’ mortgages) generally not available in the prime market.²
The subprime securitized mortgage market in the United States boomed between 2001 and 2006 and began to collapse in 2007. To better picture the size of this market ($1.8 trillion of US subprime securitized mortgage debt outstanding),³ it is useful to compare it with the value of the entire mortgage debt in the United States (approximately $11.3 trillion)⁴ and the value of securitized mortgage debt ($6.8 trillion).⁵ In other words, as of the second quarter of 2008, the subprime securitized market was roughly one-third of the total securitized market in the United States, or approximately 16 per cent of the entire US mortgage debt. Before the crisis, it was believed that a market of such small size (relative to the US total mortgage market) could not cause significant problems outside the subprime sphere even if it were to crash completely. However, we now see a severe ongoing crisis, a crisis that has affected the real economies of many countries in the world, causing recessions, banking and financial crises, and a global credit crunch.
The large effect of the relatively small subprime component of the mortgage market and its collapse was most likely due to the complexity of the market for the securities that were created based on subprime mortgages. The securities were created by pooling individual subprime mortgages together; in addition, the securities themselves were again repackaged and tranched to create more complicated financial instruments.

² See Demyanyk and Van Hemert (2008) and Demyanyk (2008) for a more detailed description and discussion.
³ The total value of subprime securities issued between 2000 and 2007, as calculated by Inside Mortgage Finance, 2008.
⁴ Total value of mortgages outstanding in 2Q 2008. Source: Inside Mortgage Finance, 2008.
⁵ Total value of mortgage securities outstanding in 2Q 2008. Source: Inside Mortgage Finance, 2008.
The mortgage securities were again split into various new tranches, repackaged, re-split and repackaged again many times over. Each stage of the securitization process introduced more leverage for financial institutions and made valuing the holdings of those financial instruments more difficult. All this ultimately resulted in uncertainty about the solvency of a number of large financial firms, as over time the market value of the securities was heavily discounted in response to tremors in the housing market itself. Also, the securities were largely traded internationally, which led to spill-overs of the US subprime mortgage crisis and its consequences across national borders.
The remainder of this paper is organized as follows. Section 2 summarizes empirical methodologies and findings of studies that apply econometric techniques. In that section, we outline several analyses of the US subprime market and its collapse. We show that the crisis, even though significant and devastating for many, was not unique in the history of the United States or for other countries around the world. We also review analyses of bank failure and suggested remedies for financial crises in the literature. Section 3 summarizes empirical methodologies used in operations research studies analyzing and predicting bank failures. Section 4 concludes.
2 Review of econometric analyses of the subprime crisis
In this section we analyze the collapse of the subprime mortgage market in the United States and outline factors associated with it.
2.1 Collapse of the US subprime mortgage market
The first signs of the subprime mortgage market collapse in the United States were very high (and unusual even for the subprime market) delinquency and foreclosure rates for mortgages originated in 2006 and 2007. High rates of foreclosures, declining home values, borrowers’ impaired credit histories, destabilized neighborhoods, numerous vacant and abandoned properties, the absence of mechanisms providing entry into and exit out of the distressed mortgage market (uncertainty froze the market; only a limited number of home sales and purchases occurred), and an overall economic slowdown created a self-sustaining loop that market forces alone could not break.
Demyanyk and Van Hemert (2008) analyzed the subprime crisis empirically, utilizing a duration statistical model that allows estimating the so-called survival time of mortgage loans, ie, how long a loan is expected to be current before the very first delinquency (missed payment) or default occurs, conditional on never having been delinquent or in default before. The model also allows controlling for various individual loan and borrower characteristics, as well as macroeconomic circumstances. According to the estimated results, the credit score, the cumulative loan-to-value ratio, the mortgage rate, and house price appreciation have the largest (in absolute terms) marginal effects and are the most important for explaining cross-sectional differences in subprime loan performance. However, according to the same estimated model, the crisis in the subprime mortgage market did not occur because housing prices in the United States started declining, as many have conjectured. The crisis had been brewing for at least six consecutive years before signs of it became visible.
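A duration analysis of this kind can be sketched as follows. The snippet uses the lifelines library’s Cox proportional hazards estimator on simulated loan-level data; the covariate names echo those discussed above but are hypothetical stand-ins, not the authors’ actual specification.

```python
# Hedged sketch of a loan-survival (duration) model: time to first
# delinquency, right-censored at the end of the observation window.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
fico = rng.normal(620, 50, n)                  # hypothetical credit score
cltv = rng.uniform(0.6, 1.0, n)                # hypothetical combined LTV
# Assumed hazard: weaker credit and higher leverage shorten survival.
scale = np.exp(3 + 0.004 * (fico - 620) - 2.0 * (cltv - 0.8))
t = rng.exponential(scale)                     # latent months until delinquency
window = rng.uniform(0, 36, n)                 # months each loan is observed

loans = pd.DataFrame({
    "duration": np.minimum(t, window),
    "delinquent": (t <= window).astype(int),   # 1 = first delinquency observed
    "fico": fico,
    "cltv": cltv,
})
cph = CoxPHFitter()
cph.fit(loans, duration_col="duration", event_col="delinquent")
cph.print_summary()   # hazard ratios proxy the covariates' marginal effects
```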
The quality of subprime mortgages had been deteriorating monotonically every year since at least 2001; this pattern was masked, however, by house price appreciation. In other words, the quality of loans did not suddenly become much worse just before the defaults occurred: the quality was poor and worsening every year. We were able to observe this inferior quality only when the housing market started slowing down, when bad loans could not hide behind high house price appreciation, and when bad loans could no longer be refinanced.
Demyanyk and Van Hemert also show that the above-mentioned monotonic deterioration of subprime mortgages was a (subprime) market-wide phenomenon. They split their sample of all subprime mortgages into the following subsamples: fixed-rate, adjustable-rate (hybrid), purchase-money, cash-out refinancing, mortgages with full documentation, and mortgages with low or no documentation. For each of the subsamples, deterioration of the market is observable. Therefore, one cannot blame the crisis on any single cause, such as a particularly bad loan type or irresponsible lending; there were many causes.
Demyanyk (2008) empirically showed that subprime mortgages were, in fact, a temporary phenomenon, ie, borrowers who took subprime loans seemed to have used mortgages as temporary bridge financing, either in order to speculate on house prices or to improve their credit history. On average, subprime mortgages of any vintage did not last longer than three years: approximately 80 per cent of borrowers either prepaid (refinanced or sold their homes) or defaulted on the mortgage contracts within three years of mortgage origination.
Several researchers have found that securitization opened the door to increased subprime lending between 2001 and 2006, which in turn led to reduced incentives for banks to screen borrowers and increased subsequent defaults. For example, Keys et al (2008) investigate the relationship between securitization and screening standards in the context of subprime mortgage-backed securities. Theories of financial intermediation suggest that securitization, the act of converting illiquid loans into liquid securities, could reduce the incentives of financial intermediaries to screen borrowers. Empirically, the authors ‘exploit a specific rule of thumb [credit score 620] in the lending market to generate an exogenous variation in the ease of securitization and compare the composition and performance of lenders’ portfolios around the ad-hoc threshold’. They find that ‘the portfolio that is more likely to be securitized defaults by around 10–25% more than a similar risk profile group with a lower probability of securitization’, even after accounting for ‘selection on the part of borrowers, lenders, or investors’. Their results suggest that securitization does adversely affect the screening incentives of lenders.
Mian and Sufi (2008) show that securitization is associated with increased subprime lending and subsequent defaults. More specifically, the authors show that geographical areas (in this case, zip codes) with more borrowers who had credit application rejections a decade before the crisis (in 1996) had more mortgage defaults in 2006 and 2007. Mian and Sufi also find that ‘prior to the default crisis, these subprime zip codes [had experienced] an unprecedented relative growth in mortgage credit’. The expansion in mortgage credit in these neighborhoods was combined with declining income growth (relative to other areas) and an increase in securitization of subprime mortgages.
Taylor (2008) blames ‘too easy’ monetary policy decisions, and the resulting low interest rates between 2002 and 2004, for causing the monetary excess, which in turn led to the housing boom and its subsequent collapse. He compares the housing market boom that would have resulted in the US economy if monetary policy had been conducted according to the historically followed Taylor rule, a rule that suggested much higher interest rates for the period, with the actual housing boom. Based on the comparison, there would have been almost no housing boom with the higher rates. No boom would have meant no subsequent bust. The author dismisses the popular hypothesis of an excess of world savings, a ‘savings glut’, that many use to justify the low interest rates in the economy, and shows that there was, in fact, a global savings shortage, not an excess. Also, comparing monetary policy in other countries with that in the United States, Taylor notices that the housing booms were largest in countries where deviations of the actual interest rates from those suggested by the Taylor rule were the largest.
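The counterfactual hinges on the standard Taylor (1993) rule, i = r* + π + 0.5(π − π*) + 0.5y, where π is inflation, π* the inflation target, r* the equilibrium real rate, and y the output gap, all in per cent. A minimal sketch, with illustrative rather than historical inputs:

```python
def taylor_rate(inflation, output_gap, r_star=2.0, pi_star=2.0):
    """Taylor (1993) rule: i = r* + pi + 0.5*(pi - pi*) + 0.5*gap, in per cent."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

# Illustrative inputs only: with 2% inflation and a closed output gap the
# rule prescribes a 4% policy rate, well above the roughly 1% actual federal
# funds rate of 2003-2004 that Taylor criticizes.
print(taylor_rate(inflation=2.0, output_gap=0.0))
```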
There is a large literature that analyzes mortgage defaults. The analysis is important for understanding the subprime mortgage crisis, which was triggered by a massive wave of mortgage delinquencies and foreclosures. Important contributions to this literature include Deng (1997), Ambrose and Capone (2000), Deng et al (2000), Calhoun and Deng (2002), Pennington-Cross (2003), Deng et al (2005), Clapp et al (2006), and Pennington-Cross and Chomsisengphet (2007).
2.2 The subprime crisis is not unique
Demyanyk and Van Hemert (2008) show evidence that the subprime mortgage crisis in the United States seems, in many respects, to have followed the classic lending boom-and-bust cycle documented by Dell’Ariccia et al (2008). First, a sizeable boom occurred in the subprime mortgage market. Depending on the definition of ‘subprime’, the market grew from three to seven times larger between 1998 and 2005 (see Mayer and Pence (2008) for measures of the size and the increase of the subprime mortgage market based on US Department of Housing and Urban Development and LoanPerformance definitions). Second, a definitive collapse of the market occurred in 2007, which was reflected in high delinquency, foreclosure, and default rates. A year later, the subprime mortgage crisis spilled over into other credit markets, creating a much larger financial crisis and global credit crunch. Third, the periods preceding the collapse were associated with a loosening of underwriting standards, deteriorating loan quality, and increasing loan riskiness that were not backed up by an increasing price of this extra risk. In fact, the subprime-prime spread was actually declining over the boom period.
Increasing riskiness in the market, together with the decreasing price of this risk, leads to an unsustainable situation, which in turn leads to a market collapse. The subprime episode fits into this boom-bust framework easily. Moreover, not only have Demyanyk and Van Hemert (2008) shown that the crisis followed a classic path known to policymakers and researchers in several countries, but they have also shown that analysts could have foreseen the crisis as early as late 2005. It is not clear, though, whether the crisis could have been prevented at that point. Comparing the findings of Dell’Ariccia et al (2008) and Demyanyk and Van Hemert (2008), it appears the United States (in 2007); Argentina (in 1980); Chile (in 1982); Sweden, Norway, and Finland (in 1992); Mexico (in 1994); and Thailand, Indonesia, and Korea (in 1997) all experienced the culmination of similar (lending) boom-bust scenarios, but in very different economic circumstances.
Reinhart and Rogoff (2008), who analyzed macro indicators in the United States preceding the financial crisis of 2008 and 18 other post-World War II banking crises in industrial countries, also found striking similarities among all of them. In particular, the countries experiencing the crises seem to share significant increases in housing prices before the financial crises commenced. Even more striking is evidence that the United States had a much higher growth rate in its house prices than the so-called Big Five countries in their crises (Spain in 1977, Norway in 1987, Finland in 1991, Sweden in 1991, and Japan in 1992). In comparing the real rates of growth in equity market price indexes, the authors again find that pre-crisis similarities are evident among all the crisis countries. Also, in comparing the current account as a percentage of gross domestic product (GDP), not only are there similarities between countries, but the United States had larger deficits than those of the other countries before their crises, reaching more than six per cent of GDP. The authors noted, however, that there is great uncertainty associated with the still ongoing 2008–2009 crisis in the United States; therefore, it is not possible to project the path of crisis resolution based on the experiences of other countries.
2.3 Selected analyses of bank failure prediction
Demirguc-Kunt and Detragiache (1998) study the determinants of the probability of a banking crisis around the world in 1980–1994 using a multivariate Logit model. They find that banking crises are more likely in countries with low GDP growth, high real interest rates, high inflation rates, and an explicit deposit insurance system. Countries that are more susceptible to balance of payments crises also have a higher probability of experiencing banking crises.
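A minimal sketch of such a multivariate Logit early-warning regression, using statsmodels on simulated country-year data; the variable names mirror the determinants listed above, and the data-generating process is an assumption for illustration only.

```python
# Sketch of a multivariate Logit crisis regression in the spirit of
# Demirguc-Kunt and Detragiache (1998). Data simulated; names illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400  # country-year observations
df = pd.DataFrame({
    "gdp_growth": rng.normal(3, 2, n),
    "real_rate": rng.normal(2, 3, n),
    "inflation": rng.normal(8, 6, n),
    "deposit_insurance": rng.integers(0, 2, n),
})
# Assumed data-generating process for illustration only.
z = (-3 - 0.3 * df["gdp_growth"] + 0.2 * df["real_rate"]
     + 0.05 * df["inflation"] + 0.8 * df["deposit_insurance"])
df["crisis"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-z))).astype(int)

X = sm.add_constant(df[["gdp_growth", "real_rate", "inflation", "deposit_insurance"]])
result = sm.Logit(df["crisis"], X).fit(disp=0)
print(result.summary())
```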
Demirguc-Kunt and Detragiache (2002) specifically investigate the relation between explicit deposit insurance and stability in the banking sector across countries. The authors confirm and strengthen the findings of Demirguc-Kunt and Detragiache (1998) that explicit deposit insurance can harm bank stability. This happens because banks may be encouraged by the insurance to finance high-risk and high-return projects, which in turn can lead to more bank losses and failures. The authors find that deposit insurance has a more negative impact on the stability of banks in countries where the institutional environment is weak, where the coverage offered to depositors is more extensive, and where the scheme is run by the government rather than by the private sector.
Demirguc-Kunt et al (2006) examine what happens to the structure of the banking sector following a bank crisis. The authors find that individuals and companies leave weaker banks and deposit their funds in stronger banks; at the same time, aggregate bank deposits relative to countries’ GDP do not significantly decline. Total aggregate credit declines in countries after banking crises, and banks tend to reallocate their asset portfolios away from loans and improve their cost efficiency.
Wheelock and Wilson (2000) analyze what factors predict bank failure in the United States in particular. The authors use competing-risks hazard models with time-varying covariates. They find that banks with lower capitalization, higher ratios of loans to assets, poor quality loan portfolios, and lower earnings have a higher risk of failure. Banks located in states where branching is permitted are less likely to fail. This may indicate that an ability to create a branch network, and an associated ability to diversify, reduces banks’ susceptibility to failure. Further, the more efficiently a bank operates, the less likely the bank is to fail.
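A hazard model with time-varying covariates can be sketched with lifelines’ CoxTimeVaryingFitter, which takes a panel of bank-quarter spells. A full competing-risks analysis would typically estimate one such cause-specific hazard per exit type (failure, acquisition, etc.); the bank-quarter data and calibration below are simulated assumptions, not the authors’ specification.

```python
# Sketch: cause-specific failure hazard with a time-varying capital ratio.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(3)
rows = []
for bank in range(200):
    capital = rng.uniform(0.04, 0.14)              # initial capital ratio
    for q in range(12):                            # up to 12 quarterly spells
        capital += rng.normal(0, 0.005)            # ratio drifts over time
        logit = -6 + 100 * (0.08 - capital)        # low capital -> higher hazard
        failed = rng.uniform() < 1 / (1 + np.exp(-logit))
        rows.append((bank, q, q + 1, int(failed), capital))
        if failed:
            break
panel = pd.DataFrame(rows, columns=["bank", "start", "stop", "failed", "capital"])

ctv = CoxTimeVaryingFitter()
ctv.fit(panel, id_col="bank", event_col="failed",
        start_col="start", stop_col="stop")
ctv.print_summary()
```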
Berger and DeYoung (1997) analyze instances when US commercial banks face increases in the proportion of nonperforming loans and reductions in cost efficiency between 1985 and 1994. The authors find that these instances are interrelated and Granger-cause each other.
2.4 Remedies for financial crises
Caprio et al (2008) indicate that recent financial crises often occur because of booms in macroeconomic sectors; the crises are revealed following ‘identifiable shocks’ that end the booms. Importantly, the underlying distortions of economic markets build up for a long time before the crisis is identified (Demyanyk and Van Hemert (2008) identify such a process for the US subprime mortgage crisis). Caprio et al (2008) discuss the role of financial deregulation in predicting crises and identify a mechanism for interaction between governments and regulated institutions. The authors propose a series of reforms that could prevent future crises, such as lending reform, rating agency reform and securitization reform. Most importantly, according to the authors, regulation and supervision should be re-strengthened to prevent such crises in the future.
In his research, Hunter (2008) attempts to understand the causes of, and solutions for, the financial crises. He defines the beginning of the recent crisis in the United States to be the point in time when inter-bank lending stopped in the Federal Funds Market. Following this definition, the US crisis began around October 8, 2008, when the Federal Funds Rate hit a high of seven per cent during intraday trading. According to Hunter, the primary reason for the trading halt was that banks were unsure about the exposure of their counterparties to MBS risk: ‘If a bank has a large share of its asset portfolio devoted to MBS, then selling MBS to get operating cash is infeasible when the price of MBS has declined significantly. Banks in this situation are on the brink of insolvency and may indeed have difficulty repaying loans they receive through the Federal Funds Market’. The author suggests several solutions to the crisis. Among them, he emphasizes the importance of transparency in the operation of and analysis by MBS insurers and bond rating agencies. He also stresses the development of a systematic way of evaluating counterparty risk within the financial system. In the short term, he suggests that the Fed could encourage more borrowing through the Discount Window.
Diamond and Rajan (2009) also analyze the causes of the recent US financial crisis and provide some remedies for it. According to the authors, the first reason for the crisis was a misallocation of investment, which occurred because of the mismatch between the soft information loan officers based credit decisions on and the hard information (like credit scores) the securities trading agencies used to rate mortgage bonds. This was not a big problem as long as house prices kept rising. However, when house prices began to decline and defaults started increasing, the valuation of securities based on loans became a big problem (as the ratings may not truly capture the risk of loans within those securities). The second reason for the crisis was excessive holdings of these securities by banks, which is associated with an increased default risk. To solve or mitigate the crisis, Diamond and Rajan first suggest that the authorities can offer to buy illiquid assets through auctions and house them in a federal entity. The government should also ensure the stability of the financial system by recapitalizing those banks that have a realistic possibility of survival, and merging or closing those that do not.
Brunnermeier (2008) tries to explain the economic mechanisms that caused the housing bubble and the turmoil in the financial markets. According to the author, there are three factors that led to the housing expansion. The first is a low interest-rate and mortgage-rate environment for a relatively long time in the United States, likely resulting from large capital inflows from abroad (especially from Asian countries) and accompanied by the lax interest rate policy of the Federal Reserve. Second, the Federal Reserve did not move to prevent the buildup of the housing bubble, most likely because it feared a possible deflationary period following the bursting of the Internet stock bubble. Third, and most importantly, the US banking system had been transformed from a traditional relationship banking model, in which banks issue loans and hold them until they are repaid, to an ‘originate-to-distribute’ banking model, in which loans are pooled, tranched and then sold via securitization. This transformation can reduce banks’ monitoring incentives and increase their possibility of failure if they hold a large amount of such securities without fully understanding the associated credit risk.
Brunnermeier further identifies several economic mechanisms through which the mortgage crisis was amplified into a broader financial crisis. All of the mechanisms begin with the drop in house prices, which eroded the capital of financial institutions. At the same time, lenders tightened lending standards and margins, which caused fire sales, further pushing down prices and tightening credit supplies. When banks became concerned about their ability to access capital markets, they began to hoard funds. Consequently, with the drop in balance sheet capital and difficulties in accessing additional funding, banks that held large amounts of MBS failed (eg, Bear Stearns, Lehman Brothers, and Washington Mutual), causing a sudden shock to the financial market.
Several researchers conclude that the ongoing crisis does not reflect a failure of free markets, but rather a reaction of market participants to distorted incentives (Demirguc-Kunt and Serven, 2009). Demirguc-Kunt and Serven argue that the ‘sacred cows’ of financial and macro policies are not ‘dead’ because of the crisis. Managing a systemic panic requires policy decisions to be made in different stages: the immediate containment stage and a longer-term resolution accompanied by structural reforms. Policies employed to reestablish confidence in the short term, such as providing blanket guarantees or government purchases of large stakes in the financial sector, are fraught with moral hazard problems in the long term and might be interpreted by the market as permanent deviations from well-established policy positions. Long-term financial sector policies should align private incentives with the public interest without taxing or subsidizing private risk-taking (Demirguc-Kunt and Serven, 2009). Although well designed prudential regulations cannot completely eliminate the risk of crises, they can make crises less frequent. However, balancing the short- and long-term policies becomes complex in the framework of an integrated and globalized financial system.
Analyzing the Asian financial crisis, Johnson et al (2000) present evidence that country-level corporate governance practices and institutions, such as the legal environment, have an important effect on currency depreciations and stock market declines during financial crisis periods. The authors borrow from the corporate governance literature (see Shleifer and Vishny, 1997) theoretical arguments that corporate governance is an effective mechanism to minimize agency conflicts between inside managers and outside stakeholders. The authors empirically show that corporate governance, measured as efficiency of the legal system, corruption and rule of law, explains more of the variation in exchange rates and stock market performance than do macroeconomic variables during the Asian crisis.
Angkinand (2009) reviews methods used to evaluate the output loss from financial crises. The author argues that an empirical methodology estimating the total output loss per crisis from the deviation of actual output from the potential output trend (the gap approach) estimates the economic costs of crises better than a methodology that estimates a dummy variable to capture the crisis (the dummy variable approach), because the output costs of different crisis episodes vary significantly.
A book by Barth et al (2009) provides a descriptive analysis explaining how the crisis emerged in the United States and what actions the US government is taking to remedy the economic and credit market contractions. A valuable contribution of the study is a list of US bailout allocations and obligations. This list is also frequently updated and reported on the Milken Institute web page.⁶
3 Review of operations research models
In this section, we describe selected operations research models that are frequently used in the empirical literature to predict defaults or failures of banks and that could be used to predict defaults of loans or non-financial institutions.
Predicting the default risk for banks, loans and securities is a classic, yet timely issue. Since the work of Altman (1968), who suggested using the so-called ‘Z score’ to predict firms’ default risk, hundreds of research articles have studied this issue (for reference, see two review articles: Kumar and Ravi (2007) and Fethi and Pasiouras (2009)).
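For reference, Altman’s (1968) Z score is a linear combination of five accounting ratios, with scores below roughly 1.81 conventionally read as the ‘distress zone’ and above 2.99 as the ‘safe zone’. A one-function sketch follows; the coefficients are Altman’s published ones, while the sample inputs are invented.

```python
def altman_z(wc_ta, re_ta, ebit_ta, mve_tl, sales_ta):
    """Altman (1968) Z score from the five ratios X1..X5 (decimal form)."""
    return (1.2 * wc_ta + 1.4 * re_ta + 3.3 * ebit_ta
            + 0.6 * mve_tl + 1.0 * sales_ta)

# Invented firm: Z < 1.81 would conventionally signal distress.
print(altman_z(wc_ta=0.1, re_ta=0.2, ebit_ta=0.08, mve_tl=1.5, sales_ta=1.1))
```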
Several studies have shown that intelligence modeling techniques used in operations research can be applied to predicting bank failures and crises. For example, Celik and Karatepe (2007) find that artificial neural network models can be used to forecast the rates of non-performing loans relative to total loans, capital relative to assets, profit relative to assets, and equity relative to assets. In another example, Alam et al (2000) demonstrate that fuzzy clustering and self-organizing neural networks provide classification tools for identifying potentially failing banks.
Most central banks have employed various Early Warning Systems (EWS) to monitor the risk of banks for years. However, the repeated occurrence of banking crises during the past two decades, such as the Asian crisis, the Russian bank crisis, and the Brazilian bank crisis, indicates that safeguarding the banking system is no easy task. According to the Federal Deposit Insurance Corporation Improvement Act of 1991, regulators in the United States must conduct on-site examinations of bank risk every 12–18 months. Regulators use a rating system (the CAMELS rating) to indicate the safety and soundness of banks. CAMELS ratings comprise six components: capital adequacy, asset quality, management expertise, earnings strength, liquidity, and sensitivity to market risk.
Davis and Karim (2008a) evaluate statistical and intelligence techniques in their analysis of banking crises. Specifically, they compare the logistic regression (Logit) and the Signal Extraction EWS methods.⁷ They find that the choice of estimation model makes a difference in terms of indicator performance and crisis prediction. Specifically, the Logit model performs better as a global EWS, and Signal Extraction is preferable as a country-specific EWS. Davis and Karim (2008b) test whether EWS based on the Logit and binomial tree approaches (this technique is described below) could have helped predict the current subprime crisis in the US and UK. Using twelve macroeconomic, financial and institutional variables, they find that among global EWS for the US and UK, the Logit performs the best. However, this model, like many others, has only a small ability to predict the crises.

⁶ http://www.milkeninstitute.org/publications/publications.taf?function=detail&ID=38801185&cat=resrep.
⁷ The term ‘signal extraction’ refers to a statistical tool that allows for isolation of a pattern in the data (the signal) out of noisy or raw time-series data.
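The mechanics of the signal-extraction approach can be sketched as follows: for each candidate indicator one searches for the threshold that minimizes the noise-to-signal ratio, ie, the false-alarm rate divided by the hit rate. The data and threshold grid below are simulated assumptions that mirror the logic only, not Davis and Karim’s actual code.

```python
# Sketch of the 'signals' EWS: choose the indicator threshold minimizing
# the noise-to-signal ratio. Simulated data for illustration only.
import numpy as np

rng = np.random.default_rng(4)
crisis = (rng.uniform(size=300) < 0.15).astype(int)   # 1 = crisis ahead
indicator = rng.normal(0, 1, 300) + 1.2 * crisis      # indicator rises pre-crisis

best = None
for thresh in np.quantile(indicator, np.linspace(0.50, 0.95, 10)):
    signal = indicator > thresh
    hit_rate = signal[crisis == 1].mean()             # A / (A + C)
    false_rate = signal[crisis == 0].mean()           # B / (B + D)
    if hit_rate > 0:
        nts = false_rate / hit_rate                   # noise-to-signal ratio
        if best is None or nts < best[1]:
            best = (thresh, nts)
print("threshold %.2f gives noise-to-signal %.2f" % best)
```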
West (1985) uses the Logit model, along with factor analysis, to measure and describe banks’ financial and operating characteristics. Data were taken from Call and Income Reports, as well as Examination Reports, for 1,900 commercial banks in several states of the US. According to the analysis, the factors identified by the Logit model as important descriptive variables for the banks’ operations are similar to those used for CAMELS ratings. He demonstrates that his combined method of factor analysis and Logit estimation is useful when evaluating banks’ operating conditions.
Among the statistical techniques analyzing and predicting bank failures, Discriminant Analysis (DA) was the leading technique for many years (eg, Karels and Prakash (1987), Haslem et al (1992)). There are three subcategories of DA: Linear, Multivariate, and Quadratic. One drawback of DA is that it requires a normal distribution of regressors.⁸ When regressors are not normally distributed, maximum likelihood methods, such as Logit, can be used.⁹ DA is a tool for analyzing cross-sectional data. If one needs to analyze time series data on bank, firm, or loan defaults, hazard or duration analysis models can be used instead of DA models.¹⁰

⁸ Martin (1977) is an early study that uses both Logit and DA statistical methods to predict bank failures in the period from 1975 to 1976, based on data obtained from the Federal Reserve System. The author finds that the two models have similar classifications in terms of identifying failures/non-failures of banks.
⁹ As in, for example, Martin (1977), Ohlson (1980), Kolari et al (2002) and Demyanyk (2008).
¹⁰ See Cole and Gunther (1995), Lane et al (1986), Molina (2002), among many others.
Canbas et al (2005) propose an Integrated Early Warning System (IEWS) that combines DA, Logit, Probit, and Principal Component Analysis (PCA), and that can help predict bank failure. First, they use PCA to detect three financial components that significantly explain changes in the financial condition of banks. They then employ DA, Logit and Probit regression models. By combining all of these together, they construct an IEWS. The authors use data for 40 privately owned Turkish commercial banks to test the predictive power of the IEWS, concluding that the IEWS has more predictive ability than the other models used in the literature.
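The mechanics of chaining PCA into a Logit classifier, which is the core of such an integrated system, can be sketched in a few lines; the twelve ‘ratios’ and the failure rule below are simulated stand-ins, not the Canbas et al data or full model.

```python
# Sketch: PCA feature extraction feeding a Logit classifier (IEWS spirit).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 12))        # 12 hypothetical financial ratios
failed = (X[:, :3].sum(axis=1) + rng.normal(0, 1, 200) > 2).astype(int)

iews = make_pipeline(StandardScaler(), PCA(n_components=3), LogisticRegression())
iews.fit(X, failed)
print("variance captured by 3 components:",
      iews.named_steps["pca"].explained_variance_ratio_.sum())
```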
Among intelligence techniques, Neural Networks (NN) is the most widely used. The NN model has developed out of the fields of artificial intelligence and brain modeling, and contains mathematical and algorithmic elements that mimic the biological neural networks of the human nervous system. The method considers an interrelated group of artificial neurons and processes information associated with them using a so-called connectionist approach, where network units are connected by a flow of information. The structure of NN models changes based upon external or internal information that flows through the network during the learning phase, and the models use nonlinear function approximation tools to test the relationship between explanatory factors.
Boyacioglu et al (2008) compare various NN, Support Vector Machine (SVM) and multivariate statistical methods on the bank failure prediction problem in Turkey. They use financial ratios similar to those used in CAMELS ratings. In the category of NN, four different architectures are employed, namely MLP, CL, SOM and LVQ (the details of these architectures are not described in this review). The multivariate statistical methods tested are multivariate discriminant analysis, K-means cluster analysis, and Logit regression analysis. According to the comparison, MLP and LVQ can be considered the most successful models in predicting the financial failure of banks in the sample.
The Back-Propagation Neural Network (BPNN) model is a multilayer NN model. The first layer is constructed from input units, the middle layer consists of hidden units, and the last layer consists of output units. Each upper layer receives inputs from units of a lower level and transmits output to units of the layer above it. The important feature of BPNN is that the errors of hidden-layer units are calculated by propagating the output-layer errors backwards through the network. BPNN overcomes the classification restriction of a single-layer network, and it is one of the most commonly used methods for classification and prediction problems. Many studies compare the classification and prediction accuracy of BPNN and other methods and find that, in most cases, BPNN outperforms the other models.
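To make the error back-propagation step explicit, here is a minimal one-hidden-layer network written out in numpy; the toy data, network size, and learning rate are illustrative assumptions only.

```python
# Minimal back-propagation sketch for a one-hidden-layer network (the BPNN
# structure described above). Toy data; illustration only.
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 4))                    # 4 input ratios per bank
y = (X[:, 0] - X[:, 1] > 0).astype(float).reshape(-1, 1)  # toy failure label

sigmoid = lambda z: 1 / (1 + np.exp(-z))
W1, b1 = rng.normal(0, 0.5, (4, 6)), np.zeros(6)  # input -> hidden
W2, b2 = rng.normal(0, 0.5, (6, 1)), np.zeros(1)  # hidden -> output

lr = 0.5
for epoch in range(2000):
    h = sigmoid(X @ W1 + b1)                     # forward pass
    out = sigmoid(h @ W2 + b2)
    err_out = out - y                            # output-layer error
    err_hid = (err_out @ W2.T) * h * (1 - h)     # error back-propagated to hidden layer
    W2 -= lr * h.T @ err_out / len(X); b2 -= lr * err_out.mean(axis=0)
    W1 -= lr * X.T @ err_hid / len(X); b1 -= lr * err_hid.mean(axis=0)

print("training accuracy:", ((out > 0.5) == y).mean())
```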
For example, Tam (1991) uses a BPNN model to predict bank failures in a sample of Texas banks one year and two years prior to their failures. The input variables he uses are based on the CAMELS criteria. He finds that BPNN outperforms all other methods, such as DA, Logit, and K-nearest neighbor (this method is described below), in terms of predictive accuracy. Similarly, several other studies, briefly described below, find that BPNN offers better prediction or classification accuracy than other methods.
Ravi and Pramodh (2008) propose a Principal Component Neural Network (PCNN) architecture for bankruptcy prediction in commercial banks. In this architecture, the hidden layer is completely replaced by what is referred to as a ‘principal component layer’. This layer consists of a few selected components that perform the function of hidden nodes. The authors tested the framework on data from Spanish and Turkish banks. According to the estimated results, hybrid models that combine PCNN with several other models outperform the other classifiers used in the literature in predicting bank bankruptcy.
Tam and Kiang (1992) compare the power of linear discriminant analysis (LDA), Logit, K-Nearest Neighbor (described below), Interactive Dichotomizer 3 (ID3), feedforward NN and BPNN on bank failure prediction problems. They find that BPNN outperforms the other techniques for a one-year-prior training sample, while DA outperforms the others for a two-years-prior training sample. However, for holdout samples, BPNN outperforms the others in both the one-year-prior and the two-years-prior samples. Under the jackknife method, BPNN also outperforms the others in both the one-year-prior and the two-years-prior holdout samples. In all, they conclude that NN outperforms the DA method.
Bell (1997) compares Logit and BPNN models in predicting bank failures. In his study, he uses 28 candidate predictor variables. The architecture of his BPNN has twelve input nodes, six hidden nodes and one output node. He finds that neither the Logit nor the BPNN model dominates the other in terms of predictive ability. However, BPNN is found to be better for complex decision processes.
Swicegood and Clark (2001) compare DA, BPNN and human judgment in predicting bank failures. The authors use data from bank Call Reports. They find that BPNN outperforms the other models in identifying underperforming banks.
Olmeda and Fernandez (1997) compare the accuracy of bankruptcy prediction methods that include classifiers in a stand-alone model with those in a hybrid system, which integrates several classifiers. They propose a framework for formulating the optimal mixture of technologies as an optimization problem and solve it using a genetic algorithm. Using data from the Spanish banking system, they find that BPNN performs the best, Logit the second best, and multivariate adaptive regression splines (MARS), C4.5 (not described in this review) and DA follow in that order. The authors then combine models using a voting scheme and a compensation aggregation method. They find that the prediction rates produced by the combined models are higher than those produced by the stand-alone models.
The Trait Recognition technique develops a model from different segments of the distribution of each variable and the interactions of these segments with one or more other variables’ segmented distributions. It uses two sets of discriminators, the ‘safe traits’ and the ‘unsafe traits’, known as features. These features can then be used to predict bank failures by voting on each bank and classifying it as ‘failed’ or ‘non-failed’. Trait recognition is a nonparametric approach that does not impose any distributional assumptions on the testing variables already contained within the data. The advantage of the trait recognition approach is that it exploits information about the complex interrelations of variables. The power of this approach depends on the adequate selection of cut points for each of the variables, so that all failed banks can be located below some threshold and all non-failed banks above it.
Kolari et al (2002) develop an EWS based on Logit and the Trait Recognition method for large US banks. The Logit model correctly classifies over 96% of the banks one year prior to failure and 95% of the banks two years prior to failure. For the Trait Recognition model, half of the original sample is used. They find that with data classification both one year and two years prior to failure, the accuracy of the Trait Recognition model is 100%. Therefore, they conclude that the Trait Recognition model outperforms the Logit model in terms of type-I and type-II errors.
Lanine and Vander Vennet (2006) employ a Logit model and a Trait Recognition approach to predict failures among Russian commercial banks. The authors test the predictive power of the two models based on their prediction accuracy in holdout samples. Although both models perform better than the benchmark, the Trait Recognition approach outperforms Logit in both the original and the holdout samples. Among the predictor variables, they find that expected liquidity plays an important role in bank failure prediction, as do asset quality and capital adequacy.
The Support Vector Machine (SVM) technique is based on the Structural Risk Minimization (SRM) principle from computational learning theory, which was introduced by Vapnik (1995). In the SVM method, input data are structured as two sets of vectors in a multi-dimensional space. The purpose is to maximize the margin between the two data sets. In order to calculate the margin, two parallel hyperplanes need to be constructed, one on each side of the separating hyperplane, which are pushed up against the two data sets. A good separation is achieved by the hyperplane that has the largest distance from the neighboring data points of both classes; the larger the margin, the better the generalization error of the classifier. In sum, SVM uses a special linear model and the optimal separating hyperplane to achieve the maximum separation between two classes. The training points that are closest to the maximum margin hyperplane are called support vectors. Such models are utilized in Vapnik (1995), Boyacioglu et al (2008), Chen and Shih (2006) and Huang et al (2004), among others.
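A minimal sketch of the maximum-margin idea, using scikit-learn’s linear SVM on simulated two-ratio bank data; the ratio names, labels, and data are invented for illustration.

```python
# Sketch of a maximum-margin SVM classifier for failed/non-failed banks.
# The support vectors are the training points that pin down the hyperplane.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(7)
X = rng.normal(size=(120, 2))                 # two hypothetical financial ratios
y = (X[:, 0] + X[:, 1] > 0.3).astype(int)     # toy 'failed' label

svm = SVC(kernel="linear", C=1.0)
svm.fit(X, y)
print("support vectors per class:", svm.n_support_)
```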
The Decision Tree (DT) technique, which comes from research on machine learning, uses a recursive partitioning algorithm to induce rules on a given data set. Most decision tree algorithms are used for solving classification problems. However, algorithms like classification and regression trees (CART) can also be used for solving prediction problems. In this case, a binary decision tree is developed through a set of IF-THEN rules. These rules can be used to accurately classify cases (eg, banks). A number of algorithms are used for building decision trees, including CHAID (chi-squared automatic interaction detection), CART, C4.5 and C5.0. For more information, see Marais et al (1984) and Frydman et al (1985).
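A sketch of a CART-style tree whose IF-THEN rules can be printed directly; the two ratios and the labeling rule are invented for illustration.

```python
# Sketch: a shallow CART-style tree; export_text prints its IF-THEN rules.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(8)
X = rng.uniform(0, 1, size=(300, 2))          # [capital ratio, NPL ratio]
y = ((X[:, 0] < 0.3) & (X[:, 1] > 0.5)).astype(int)  # toy 'failed' rule

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(tree, feature_names=["capital_ratio", "npl_ratio"]))
```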
The Rough Set technique is a mathematical method for modeling incomplete data based on a concept introduced by Pawlak (1982). It approximates a usually vague objective into predefined categories, which can then be iteratively analyzed. See Greco et al (1998) for details.
Case-Based Reasoning (CBR) is a method similar to the cognitive process humans follow in solving problems intuitively. CBR can be represented by a schematic cycle comprising four steps. The first step is to retrieve the most similar cases. The second is to reuse the cases to attempt to solve the problem. The third is to revise the proposed solution, if necessary. And the fourth is to retain the new solution as a part of a new case. CBR methodology enables an analyst to predict failure of a company based on failures of other companies that occurred in the past.
The Nearest Neighbor technique classifies an object in the class of its nearest neighbor in the measurement space, using a certain distance measure such as local metrics, global metrics, or the Mahalanobis or Euclidean distance. The method has a variety of applications, ranging from analyzing settlement patterns in landscapes to spam classification, or any other distribution of objects and events; one can determine whether objects or events are random, clustered, or distributed regularly. The K-nearest neighbor (K-NN) technique is a modified Nearest Neighbor technique. In this model, K is a positive, usually small, integer. An object (for example, a bank) is assigned to the class most common amongst its K nearest neighbors (the class is either ‘failed’ or ‘non-failed’).
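A minimal K-NN sketch with Euclidean distance, classifying a bank by the majority class of its five nearest neighbors; the ratios and labels are invented.

```python
# Sketch: K-NN classification of banks by Euclidean distance in ratio space.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(9)
X = rng.normal(size=(150, 3))                 # three hypothetical ratios
y = (X.sum(axis=1) > 0).astype(int)           # toy failed/non-failed labels

knn = KNeighborsClassifier(n_neighbors=5, metric="euclidean")
knn.fit(X, y)
print(knn.predict(X[:5]))                     # class = majority of 5 nearest banks
```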
Zhao et al (2009) compare the performance of several factors that are used for predicting bank failures based on Logit, DT, NN, and K-NN models. The authors find that model choice is important in terms of the explanatory power of predictors.
The Soft Computing technique is a hybrid system combining intelligence and statistical techniques. Specifically, it refers to a combination of computational techniques used to model and analyze complex phenomena. Compared to traditional ‘hard’ computing techniques, which use exact computations and algorithms, soft computing is based on inexact computation, trial-and-error reasoning, and subjective decision making. Such computation builds on mathematical formalization of cognitive processes similar to those of human minds. More information is available in Back and Sere (1996), Jo and Han (1996), and Tung et al (2004).
Data Envelopment Analysis (DEA) is a non-parametric performance method used to measure the relative efficiencies of organizational or decision-making units (DMUs). DEA applies linear programming to observed inputs consumed and outputs produced by decision-making units (such as branches of a bank or departments of an institution). It constructs an efficient production frontier based on best observed practices. Each DMU’s efficiency is then measured against this computed frontier. The relative efficiency is calculated as the ratio of the weighted sum of all outputs to the weighted sum of all inputs, with the weights selected to achieve Pareto optimality for each DMU.
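The envelope form of the input-oriented CCR DEA model can be solved as one small linear program per DMU: minimize θ subject to Xλ ≤ θx_o, Yλ ≥ y_o, λ ≥ 0. The sketch below uses scipy with invented inputs and outputs; it is the textbook CCR formulation, not the code of any study reviewed here.

```python
# Sketch: input-oriented CCR DEA, one linear program per DMU.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0, 3.0, 5.0],   # input 1 for 4 DMUs (eg staff)
              [3.0, 2.0, 5.0, 4.0]])  # input 2 (eg branches)
Y = np.array([[1.0, 2.0, 1.5, 2.5]])  # one output (eg loans issued)
n = X.shape[1]

for o in range(n):
    # decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]                        # minimize theta
    A_in = np.hstack([-X[:, [o]], X])                  # X @ lam <= theta * x_o
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y]) # Y @ lam >= y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[0]), -Y[:, o]],
                  bounds=[(0, None)] * (n + 1))
    print("DMU %d efficiency: %.3f" % (o + 1, res.x[0]))
```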
Luo (2003) uses the DEA model to evaluate the profitability and marketability efficiencies of large banks. In the model, the author analyzes banks’ revenue and profit as the measured outputs of both efficiencies. He finds that marketability inefficiency creates more problems for the analyzed banks than profitability inefficiency. In an application to the prediction of banking crises, the findings suggest that the overall technical efficiency of profitability performance is associated with the likelihood of bank failure.
Avkiran (2009) analyzes the profit efficiency of commercial banks in the United Arab Emirates by applying a standard DEA and a network DEA (NDEA) technique. The author mentions that the standard DEA does not provide sufficient detail to identify the specific sources of inefficiency; network DEA gives access to this underlying diagnostic information, because each division of an institution can be treated as an independent DMU under the NDEA. Note that efficiency measures derived from stochastic DEA do not account for statistical noise; the impact of measurement error on efficiency is generally overlooked, and it is not possible to conduct formal statistical inference using stochastic DEA.
Kao and Liu (2004) formulate a DEA model of interval data for use in evaluating the performance of banks. Their study makes advance predictions of the performance of 24 Taiwanese banks based on uncertain financial data (reported in ranges) and also presents the predictions of efficiency scores (again in ranges). They find that the model-predicted efficiency scores are similar to the actual efficiency scores (calculated from the data). They also show that the poor performance of the two banks taken over by the Financial Restructuring Fund of Taiwan could actually have been predicted in advance using their method.
Tsionas and Papadakis (2009) provide a statistical framework that can be used with stochastic DEA. In order to make inferences on the efficiency scores, the authors use a Bayesian approach to the problem, set up around simulation techniques. They also test the new methods on the efficiency of Greek banks, and find that the majority of the Greek banks operate close to market best practices.
Cielen et al (2004) compare the performance of a DEA model, a Minimized Sum of Deviations (MSD) model, and a rule induction (C5.0) model in bankruptcy prediction. MSD is a combination of linear programming (LP) and DA. Using data from the National Bank of Belgium, they find that MSD, DEA and C5.0 obtain correct classification rates of failure for 78.9%, 86.4% and 85.5% of banks, respectively. They conclude that DEA outperformed the C5.0 and MSD models in terms of accuracy.
Kosmidou and Zopounidis (2008) develop a bank failure prediction model based on a multicriteria decision technique called UTilités Additives DIScriminantes (UTADIS). The purpose of the UTADIS method is to develop a classification model through an additive value function. Based on the values obtained from the additive value function, the authors classify banks into multiple groups by comparing them with some reference profiles (also called cut-off points). UTADIS is well suited to ordinal classification problems, and it is not sensitive to statistical problems because the additive utility function is estimated through mathematical linear programming techniques instead of statistical methods. Using a sample of US banks for the years 1993–2003, the authors use this technique to differentiate US banks between failed and non-failed. The results show that UTADIS is quite efficient for the evaluation of bank failure as early as four years before it occurs. The authors also compare UTADIS with other traditional multivariate data analysis techniques and find that UTADIS performs better, and could be used efficiently for predicting bank failures.
The Multicriteria Decision Aid (MCDA) method is a model that allows for the analysis of several preference criteria simultaneously. Zopounidis and Doumpos (1999b) apply MCDA to sorting problems, where a set of alternative actions is classified into several predefined classes. Based on the multidimensional nature of financial risk, Doumpos and Zopounidis (2000) propose a new operational approach called the Multi-Group Hierarchical Discrimination (M.H.DIS) method, which originates from MCDA, to determine the risk classes to which the alternatives belong. Using World Bank data, the authors apply this method to develop a model which classifies 143 countries into four risk classes based on their economic performance and creditworthiness. The authors conclude that this method performs better than traditional multiple discriminant analysis.¹¹

¹¹ There are several other models, not discussed in this section, such as Fuzzy Logic (FL) techniques, the Evolutionary Approach, and others.
MCDA can be used in credit ratings and bank soundness assessment. For example, Gaganis et al (2006) apply an MCDA model using the UTADIS method to classify banks into three groups based on their soundness. The sample includes 894 banks from 79 countries, and the model is developed through a tenfold cross-validation procedure. Their results show that asset quality, capitalization and the market where banks operate are the most important criteria in classifying the soundness of banks. Profitability and efficiency are also important factors associated with banks’ performance. Furthermore, they find that UTADIS outperforms DA and Logit in terms of classification accuracy. Zopounidis and Doumpos (1999a) also explore whether the UTADIS method is applicable to analyzing business failure. They compare this method to DA and standard Logit and Probit statistical models.
Pasiouras et al (2007) test whether an MCDA model can be used to replicate the credit ratings of Fitch on Asian banks. Five financial and five non-financial variables measuring bank and country characteristics are included in the model, and the model is tested through a tenfold cross-validation. The results show that ‘equity/customer and short-term funding, net interest margin and return on average equity are the most important financial variables. The number of shareholders, the number of subsidiaries and the banking environment of the country’ are the most important non-financial factors. The authors compare the accuracy of this prediction model with that of DA and ordered Logit; they find that MCDA is more efficient and that it replicates the Fitch credit ratings with ‘satisfactory accuracy’.
Niemira and Saaty (2004) use a multiple criteria decision-making model to predict the likelihood of a financial crisis based on an Analytic Network Process (ANP) framework. They test the model on the US banking crisis of the 1990s, and find that the ANP analysis provides a structure that can reduce judgmental forecast error through improved reliability of information processing. They conclude that the ANP framework is more flexible and more comprehensive than traditional models, and that it is a promising methodology for forecasting the probability of crises.
Ng et al (2008) propose a Fuzzy Cerebellar Model Articulation Controller (FCMAC) model based on a compositional rule of inference, called FCMAC-CRI(S). The new architecture integrates fuzzy systems and NN to create a hybrid structure called a neural fuzzy network. This network operates through localized learning. It takes as inputs data from public financial information and analyzes patterns of financial distress through fuzzy IF-THEN rules. Such processing can provide a basis for an EWS and insights into various aspects of financial distress. The authors compare the accuracy of FCMAC-CRI(S) to Cox’s proportional hazard model and the GenSoFNN-CRI(S) network model and find that the performance of the new approach is better than that of the benchmark models.