Friday, May 14, 2010

A Game of Clue

In the parallel universe that most Hollywood movies are set in, global catastrophes are often masterminded by evil geniuses with sinister motives. The implicit rules of this nigh-incomprehensible world usually ensure that the audience is afforded a fleeting glimpse of the principal villain before the true nature and extent of the impending disaster is revealed. We find him in a darkened, opulent office, skulking in a high-backed leather chair, taking great care to ensure that nothing but the top of his head is visible over its back. In the portentous silence, a slowly curling wisp of smoke rises ponderously from an expensive cigar held in a bejewelled hand.

We only really get to meet this villain once some hero has braved one unimaginable danger after another to arrive at his doorstep and have a word with him about what he’s doing to the world. The imposing leather chair now swings around smoothly, and we find ourselves face to face with a man whose appearance alone may have forced him into a career of extravagant criminal activities. Our villain sports slicked-back hair and a permanent sneer at the stupidity of the world around him; more often than not, a disfiguring scar of some sort graces his facial features. In Hollywood-land, if you’re the man behind the destruction of the world, you must look evil enough to be the man behind the destruction of the world.

But that’s just how Hollywood sees things.

In real life, global catastrophes sometimes just happen, in the complete absence of scheming criminal masterminds. Case in point: the global financial crisis that had begun to manifest itself in the world’s most powerful economies by around mid-2007. No one person caused the global financial crisis; rather, it was the complex interplay of the actions of several key individuals and institutions that led to the conditions of the crisis. Nevertheless, an examination of the facts and of the sequence of events could allow one to guess at which people were most culpable in bringing about the crisis. It’s a bit like a game of Clue, really.

Well. Let’s play, then.

Bubbles in the Economic Ocean

We begin our manhunt by contemplating the strange and (largely) inexplicable events that have come to be known as bubbles. For those completely uninitiated in the technical jargon that is usually used in discussing the global financial crisis, it should be pointed out right away that economic bubbles have about as much to do with the kind of bubbles you’d find in your bathtub as the physicist’s notion of work has to do with what you handed in last week as your homework.

An economic bubble has been defined as a condition where “trade in high volumes occurs at prices that are considerably at variance with intrinsic values”. What this basically means is that when an economic bubble forms in the market for a particular commodity, a disproportionately large volume of that commodity is being produced and sold, and at a price considerably higher than its equilibrium value. The equilibrium price of a commodity is simply the stable price at which the quantity supplied consistently equals the quantity demanded.
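
To make the idea of an equilibrium price concrete, here's a toy sketch in Python. The demand and supply curves are invented for illustration; the point is simply that the equilibrium is the price at which the two quantities meet.

    # Toy linear demand and supply curves (all numbers invented).
    def demand(price):
        return 1000 - 20 * price   # buyers want less as the price rises

    def supply(price):
        return 100 + 10 * price    # sellers offer more as the price rises

    # Scan a range of prices for the one where the curves cross.
    equilibrium = min(range(1, 100), key=lambda p: abs(demand(p) - supply(p)))
    print(equilibrium, demand(equilibrium))   # 30 400: at a price of 30, supply = demand = 400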

Bubbles are said to burst (or crash) when the market comes to its senses and “realizes” that too much of a commodity is being traded at inflated values. When this happens, both the price and the quantity of trade in the market fall drastically. Economic bubbles are notoriously difficult to identify (usually because the actual intrinsic values of assets in real-world markets are almost impossible to calculate). It is often only after a bubble has burst that economists are able to be absolutely certain that a bubble existed in the market in the first place.

Further adding to their mystique is the fact that no one really knows what causes economic bubbles. And interestingly enough, some economists even deny their existence. Nonetheless, an examination of the recent history of a few major markets in the developed world in terms of bubbles goes a long way in explaining what caused the global financial crisis. Of particular interest are the Dot-Com Bubble that burst in 2000, and the U.S. Housing Bubble that’s arguably still “bursting”. Let’s take a look at each one of these in turn.

The Internet Age Hits Adolescence

Chaos in the Stock Markets

Starting around the year 1998, the meteoric rise of a myriad of IT-related companies (collectively referred to as dot-coms) boosted the economies of nations throughout the developed world. Unfortunately, much of the economic value that these companies represented turned out to be- well, worthless. The rapid growth of many dot-coms was subsequently matched only by their sudden and spectacular failures. And while the dot-com collapses had widespread repercussions, it was in the stock markets that the blow to the economy was especially obvious. That’s where the Dot-Com Bubble had been residing, quietly biding its time, waiting for the opportunity to snatch the Internet Age out of its carefree childhood years.

According to the NASDAQ Composite index (which tracks the combined performance of the stocks listed on the NASDAQ Stock Exchange), the bubble burst on March 10th, 2000. Hundreds of dot-coms collapsed after burning through their venture capital, the majority of them never having made any net profit. “Get large or get lost”- the business model built on the belief that an internet company’s survival depended primarily upon expanding its customer base as rapidly as possible, even at the cost of large annual losses- was revealed to be dangerously unsound advice. The crash of the Dot-Com Bubble wiped out around $5 trillion in market value on U.S. stock exchanges, and exacerbated the conditions of the recession that occurred between 2001 and 2003.

Alan Greenspan to the Rescue?

Following the collapse of the Dot-Com Bubble, Federal Reserve Chairman Alan Greenspan initiated several policies in the United States. The U.S. Federal Reserve System (often referred to simply as the Fed) serves as the country’s central bank; it comprises twelve regional Federal Reserve Banks in major cities across the nation. The Federal Reserve manages the nation’s money supply and its monetary policy, and is responsible for attaining the (sometimes conflicting) goals of maximum employment, stable prices, and moderate long-term interest rates.

In the aftermath of the Dot-Com Crash, Greenspan cut the federal funds rate again and again, until by mid-2003 it stood at just 1% (for comparison, the rate never dipped below 4% in 1999 or 2000). It’s been argued that this allowed huge amounts of “easy” credit-based money to be injected into the financial system, and thereby created an unsustainable economic boom. In other words, the economic growth that occurred between 2003 and 2007 is largely attributable to the excessive amount of credit sloshing around the economy at the time. All that credit wasn’t backed by enough actual assets, though; this began to become clear in mid-2007, and that’s when the whole house of cards came crashing down.

The Federal Funds Rate

In order to understand the role that the federal funds rate played in flooding the economy with credit, one must begin with a simple fact: banks create money by lending. A comparison of two hypothetical scenarios will make this a lot clearer. In the first scenario, a fellow that we’ll call Christiano Kaka earns a $1000 bonus for his work as a pro footballer. But since he’s already got millions in his bank account, he figures that there’s no point in bothering to go down to the bank to deposit the money there.

Instead, he stuffs it under his mattress. He happens to be in the habit of losing his wallet, and this safety measure ensures that even if that were to occur again, he could readily get to the cash the next time he’s in the mood to hit the nightclubs. Now here’s the important thing: that $1000 is effectively dead for so long as it stays there under Christiano Kaka’s mattress. It plays no part in the economy, and doesn’t do anything useful for anybody.

In our second scenario, Christiano Kaka realizes that he’ll be travelling past the bank on his way to the nightclubs anyway, so he does deposit the money there. Christiano Kaka’s money gets added to a large pool of money composed of the deposits from all of the bank’s customers. When a young college dropout called Bill Jobs approaches the bank with his crazy schemes of starting a company that deals in personal computers, the bank’s manager decides to throw him a bone, and loans him $1000.

For our purposes, we might as well assume that the $1000 the bank loaned to Bill Jobs is the same $1000 that Christiano Kaka deposited earlier. But, of course, Christiano Kaka hasn’t lost that money; it’s still his, as he could prove by showing us his bank statement. It’s just that the money also happens to be Bill Jobs’ at the same time. As a matter of fact, the bank has created $1000 for Bill Jobs based on the $1000 that Christiano Kaka deposited. Where in the first scenario the $1000 was retired from the economy, in this second scenario it was used to create another $1000 that will go back into the economy (when Bill Jobs rents an office, buys furniture, pays employees, etc). And that’s how banks create money.

But they can’t just go around creating as much money as they please.

The law requires all banks to maintain a certain level of reserves, either as vault cash, or in an account with the Fed. The ratio of bank reserves to money loaned cannot be allowed to fall below the limit set by the Fed. Therefore, the amount of money that any particular bank can create depends upon the amount of actual money that it holds as deposits from customers. Now, whenever a bank makes a loan, the ratio of reserves to loans falls (assuming that reserves remain constant). A bank may decide to issue a loan large enough to cause its ratio to fall below the limit set by the Fed, but it must immediately raise the reserve ratio again by borrowing cash from other banks. The interest rate at which banks borrow from one another is known as the federal funds rate.
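
Here's a minimal sketch of that mechanism, using the idealized textbook assumption that every loan is spent and redeposited in full. With an assumed 10% reserve requirement, an initial $1000 deposit can end up supporting roughly $10,000 of total deposits across the banking system:

    # Idealized fractional-reserve money creation (the textbook multiplier).
    reserve_ratio = 0.10      # assumed reserve requirement
    deposit = 1000.0          # Christiano Kaka's initial deposit
    total_money = 0.0

    for _ in range(100):      # deposit -> hold reserves -> lend -> redeposit
        total_money += deposit
        deposit *= (1 - reserve_ratio)   # only the excess over reserves can be lent

    print(round(total_money))  # ~10000, i.e. deposit / reserve_ratio

The lower the required ratio, and the cheaper it is for banks to patch up shortfalls by borrowing at the federal funds rate, the more credit the same base of deposits can support.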

When the federal funds rate is as low as 1%, it becomes very cheap for banks to borrow from one another to make up for reserve shortfalls. Hence, in the interest of making profits, banks can give out much larger loans to many more people, and still remain on the right side of the law. And that’s exactly what happened in the U.S. economy. With banks handing out credit to any and all who cared to ask for it, the economy became flooded with virtual money. The benign economic conditions that prevailed between 2003 and 2007 created not only a nation of spenders, but a nation that spent money it didn’t really have; a nation that buried itself under a mountain of debt.

Reaganomics Run Amok

The deluge of credit-based free spending in the economy was the ultimate expression of a socio-political-economic ideology that had held America in its grip for more than twenty years. It was an ideology founded upon the belief that Government was the problem, not the solution. It was an ideology that declared in stentorian tones: The marketplace must be set free!

Reaganomics.

Initiated and popularized by President Ronald Reagan and his advisors in the ‘80s, the economic policies that came to be known as Reaganomics were aimed at reducing the role of the government in the economy, and allowing it to regulate itself instead. Reagan reduced taxes for the rich, decreased government oversight of financial markets, and ushered in an era of astounding fiscal irresponsibility. Reaganomics has been repeatedly criticized for raising economic inequality and throwing both the public and private sectors of the U.S. economy into massive debt.

Traditionally, the U.S. government ran significant budget deficits only in times of war or economic emergency. Federal debt as a percentage of G.D.P. fell steadily from the end of World War II until 1980; that’s when Reagan entered the scene with his own version of the New Deal[1]. Government debt rose steadily through Reagan’s two terms in office and- except for a short hiatus during the Clinton years- continued to rise right until George W. Bush left office in 2009.

The rise in public debt, however, was nothing compared to the skyrocketing private debt.

The pattern of financial deregulation that Reagan set in motion allowed American consumers access to ever-increasing amounts of credit (and hence ever-increasing levels of debt) for decades. America wasn’t always a nation of big debts and low savings: in the ‘70s, Americans saved almost ten percent of their income (even more than in the ‘60s). It was only after the Reagan-era deregulation that thrift gradually disappeared from the American way of life, culminating in the near-zero savings rate that prevailed just before the current economic crisis hit. Household debt was only 60 percent of income when Reagan took office; by 2007 it had zoomed to more than 130 percent.

It was only with the crash of the housing market in 2007- the second major shock to the U.S. economy in the last decade- that it would become painfully clear that wanton debt as a way of life would have to be abandoned.

If You Can't Understand 'Em, Don't Regulate 'Em

Returning to the antics of Alan Greenspan in the years following the Dot-Com Crash, we find that he was also responsible for vehemently opposing any regulation of financial instruments known as derivatives. He wasn’t alone in feeling that financial markets could regulate themselves just fine: Securities and Exchange Commission Chairman Arthur Levitt and Treasury Secretary Robert Rubin also held the same view. Together, in the Clinton years, they ensured that investment banks and other financial institutions were given free rein in creating and selling these complex financial instruments.

Nonetheless, those financial institutions have little to thank Greenspan and his cohorts for; they ended up crippling themselves through their involvement in an unregulated market for complex derivatives that no one fully understood. By 2008, large portions of the derivatives portfolios of major investment banks had been reclassified as toxic assets. Lehman Brothers, Bear Stearns and Washington Mutual succumbed to the poison coursing through their veins and had to declare bankruptcy or sell off their assets under duress. Other major financial institutions, such as the insurance giant American International Group (AIG) and Citigroup, sustained huge losses and only managed to stay afloat with the help of the government.

Although most derivatives are relatively benign, the late ‘90s saw the proliferation of two particularly complex instruments that would later threaten the stability of the entire financial sector: collateralized debt obligations (CDOs) and credit default swaps (CDSs). Because these innovative new instruments offered lucrative payments in times of economic growth and rising asset prices, they spread like wildfire in the years leading up to the current financial crisis.

All derivatives derive their prices from the value of some underlying asset; in the case of credit derivatives, the underlying assets are loans. Investors can make profits on derivatives if they correctly anticipate the direction in which the prices of the underlying assets will move. Since hardly anyone had foreseen the appalling conditions of the current crisis, it’s probably fair to say that losses were made on derivatives of all kinds. But it was the fact that CDOs and CDSs became extremely popular in the (then flourishing) housing market that made them such a great threat to the stability of so many financial institutions.
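
To see how a credit derivative's value hangs on the fate of the underlying loans, consider this deliberately simplified sketch of the cash flows in a credit default swap. Every number here is invented, and real CDS pricing is vastly more involved:

    # Simplified CDS cash flows, seen from the protection buyer's side.
    notional = 1_000_000       # face value of the insured debt
    annual_premium = 0.02      # assumed premium: 2% of notional per year
    recovery_rate = 0.40       # assumed fraction recovered after a default

    def buyer_net_cashflow(years_paid, defaulted):
        premiums = -annual_premium * notional * years_paid
        payout = notional * (1 - recovery_rate) if defaulted else 0.0
        return premiums + payout

    print(buyer_net_cashflow(5, defaulted=False))  # -100000: premiums down the drain
    print(buyer_net_cashflow(2, defaulted=True))   #  560000: payout minus premiums

As long as defaults stayed rare, sellers of protection pocketed steady premiums; when defaults came in waves, they owed payouts on a scale they had never planned for.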

The risks of subprime mortgages in the housing market (which we’ll come back to later) were commonly spread out using CDOs and CDSs. It seemed to be a win-win situation for everyone involved: investment bankers could get in on the profits from the housing market, mortgage lenders could spread the risks of questionable loans, and American consumers benefitted from an infrastructure that encouraged offering home financing for all.

But once the Housing Bubble burst and house prices crashed down to nearly-inconceivable levels, those groups were left staring stupidly at one another.

The valuation of CDOs and CDSs is actually so complicated that no one could say for sure what they were worth once the housing market crashed. And without a price that all parties could agree upon, the markets for these derivatives became completely dysfunctional. As a result of the unfortunate marriage between questionable loans and the complex derivatives that securitized those loans, many American citizens lost their homes through foreclosures; mortgage lenders lost billions in loan defaults; and investment banks found themselves laden down with assets that were literally more trouble than they were worth.

Then the government stepped in to clean up the mess.

If only it had done that on a regular basis (and amidst less catastrophic circumstances) through the implementation of a more rigorous regime of financial market regulation.

Houses Under the Sea

A Cycle of Folly

It is a regrettable fact that the powers that be in America and other developed nations seem to have very short memories when it comes to the economy. Time and again, the painful lessons learnt during economic hardship were thrown out the window once things turned for the better. The stewards of the American economy would thereby doom themselves and their compatriots to relive the suffering born of past mistakes by making the same mistakes again and expecting different results. Only once catastrophe struck again would they realize that they had brought it upon themselves by ignoring the experiences gained from the last such catastrophe. But catastrophes don’t last forever; and as the most recent one faded into the past, the collective knowledge gained from the last two would disappear from the consciousness of the nation…

And so the cycle of folly that has played such an important role in determining the economic fortunes of the nation has gone on.

It began with the Great Depression. One of the most important initiatives taken to revive the economy under President Roosevelt’s watch was the passage of the Glass-Steagall Act in 1933. The Act provided for more stringent regulation of the banking sector, and aimed to prevent a repeat of the banking collapse of early 1933. Among other things, it prohibited any one institution from acting as a combination of a commercial bank, an investment bank, and an insurance company; it gave the Fed the power to regulate the interest rates banks paid on deposits; it created the Federal Deposit Insurance Corporation (FDIC) to insure bank deposits in commercial banks; and it imposed stringent restrictions on mortgage lending.

Roosevelt also spurred the government to increase spending in the economy as part of his New Deal programs. The drop in public expenditure that marked the cessation of these programs helped create another recession in 1937. From then onwards, significant decreases in public expenditure regularly led to dismal economic conditions. This happened again in 1953, when government spending contracted sharply as the Korean War wound down. President John F. Kennedy managed to halt the recession of 1960 by calling for increased public spending in the economy. In the ‘70s, the diversion of funds to the military during the Vietnam War (alongside the quadrupling of oil prices by OPEC in 1973) contributed to another major recession.

And then came the ‘80s. Ronald Reagan took the helm at a time of economic malaise, promising a new dawn (“It’s morning again in America!” his 1984 re-election campaign would proclaim), and undertook the most radical overhaul of the nation’s economic policies since F.D.R.’s New Deal. He convinced American citizens that their government had no business prying into the affairs of the market, and initiated sweeping cuts in public expenditure throughout the economy. And, perhaps more significantly, he overturned many of the regulatory policies that Roosevelt had set in place.

The Depository Institutions Deregulation and Monetary Control Act of 1980 and the Garn-St. Germain Depository Institutions Act of 1982- the latter signed into law by Reagan- rolled back parts of the regulatory provisions of the Glass-Steagall Act. Of the Garn-St. Germain Act, Reagan said: “This bill is the most important legislation for financial institutions in the last 50 years.” He may have been right about the significance of the Act, though he probably never intended for it to have the effect it eventually did. By liberalizing mortgage lending and the Savings and Loan industry, Garn-St. Germain paved the way towards a debt-ridden American economy that would be woefully unfit to weather the economic storms of the new millennium.

In the ‘90s, the American lifestyle of living beyond one’s means through the use of cheap credit was considered justifiable because once one took into account the rising values of people’s stock portfolios, everything seemed just fine. It was during this time that the final blow to the Glass-Steagall Act came in the form of the Gramm-Leach-Bliley Act of 1999. This Act allowed commercial banks, investment banks, securities firms and insurance companies to consolidate and form conglomerates.

It was believed that the conflicts of interest between these different kinds of institutions- conflicts that the Glass-Steagall Act had sought to prevent- would no longer be a problem in a flourishing financial sector. Nonetheless, it was the deregulated environment cemented by the Gramm-Leach-Bliley Act that allowed institutions such as AIG and Citigroup (which started out as Citicorp, a commercial bank, and became a financial services conglomerate through a merger with Travelers Group that the Act retroactively legitimized) to get embroiled in the problems that the mortgage industry began to face after 2007. And since these institutions- and others like them- had become “too big to fail”, the government had to spend billions of taxpayer dollars keeping them afloat.

As we’ve already noted, the stellar performances of U.S. stock markets did come to an end in 2000, with the bursting of the Dot-Com Bubble. But once the economy recovered and growth set in between 2003 and 2007, Americans returned to their free-spending ways. This time, they reasoned that a booming housing market would support their costly habits, just as they had assumed with the stock market before 2000. If anything, they were even more confident this time round. Housing was an infallible investment, right?

Wrong.

The Housing Bubble Expands

As we explore the most proximate cause of the global financial crisis- the bursting of the U.S. Housing Bubble- you’ll begin to see why it was necessary to start our discussion with the Dot-Com Crash, and to jump back and forth in time as often as we have. In a very real sense, the Housing Bubble was caused by the Dot-Com Bubble. The economic conditions that led to the formation of the Housing Bubble were created during and after the crash of the Dot-Com Bubble. Similarly, the legislation and economic policies that came into effect at that time had their roots in the policies of several decades ago; and their effects extend into the present day.

For one hundred years, between 1895 and 1995, U.S. house prices rose roughly in line with the rate of inflation. Then, between 1995 and 2005, the Housing Bubble began to envelop the economy, and house prices across the country rose at phenomenal rates. During this time, the price of the typical American house rose by 124 percent; the median house price went from 2.9 times the median household income to 4.6 times by 2006. Where the average number of houses built and sold per year before 1995 was 609,000, by 2005 that figure had risen to 1,283,000.

Housing appeared to be outperforming nearly every other sector of the U.S. economy, and there were those who would have us believe that it would continue to do so ad infinitum. Influential personalities such as David Lereah, the chief economist of the National Association of Realtors, regularly trumpeted the rock-solid dependability of housing as an investment; consider the title of his bestselling book- Are You Missing the Real Estate Boom? The media joined in, too, and helped inflate the bubble by glamorizing the housing boom with television programs such as House Hunters and My House is Worth What?

Even amidst all the frenzy, however, a small number of astute observers managed to figure out what was actually happening. In 2002, economist Dean Baker was the first to point out the existence of a bubble in the housing market; he put the value of the bubble at $8 trillion. And what’s even more impressive is that he correctly predicted that the collapse of the bubble would lead to a severe recession, and would devastate the mortgage lending industry.

The unreasonably high level of confidence in the housing market, coupled with the lenient regulations that governed mortgage lending, caused the number of mortgage-backed home purchases in the U.S. to shoot upwards after 2003. The real problem, however, was the fact that mortgage lenders got greedy, and began to offer mortgages to thousands of people who had little ability to repay them. Mortgage lenders are expected to assess the suitability of clients by checking their credit histories, income levels, and other relevant factors; but this process was often overlooked (or only nominally undertaken) in the heady years of the housing boom.

All this irresponsible lending created a huge market for what are known as subprime mortgages. Calling them “subprime” is a euphemistic way of saying that they’re extremely risky loans, and that there’s a high probability that they won’t be paid back. It was to people who didn’t qualify for “prime” loans that the mortgage lenders offered the subprime mortgages (usually at higher interest rates than “prime” mortgages). Lenders such as Countrywide, Indymac Bank, and Beazer Homes became notorious for the aggressive manner in which they marketed subprime mortgages to low-income consumers. (Two of those companies later declared bankruptcy, and one of them is being investigated for mortgage fraud).

Worsening the situation was the fact that nearly 80 percent of the subprime mortgages issued in the last few years were adjustable-rate mortgages (ARMs). Popularized in the 1980s by lenders such as World Savings Bank, the ARM seemed an innocent enough offering until the housing market turned sour in 2007. The interest rate on an ARM doesn’t remain constant throughout the term of the loan; instead, it’s tied to one of several indices that track prevailing market interest rates, with the lender’s margin added on top.
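
A rough sketch shows why a reset stings. The index values and the lender's margin below are invented; the payment calculation is the standard fixed-payment amortization formula, reapplied at each new rate:

    # Monthly payment on a loan at a given annual rate (standard amortization formula).
    def monthly_payment(principal, annual_rate, years=30):
        r = annual_rate / 12
        n = years * 12
        return principal * r / (1 - (1 + r) ** -n)

    principal, margin = 200_000, 0.025       # invented loan size and lender's margin
    for index_rate in (0.01, 0.03, 0.05):    # the benchmark index drifts upward
        rate = index_rate + margin           # ARM rate = index + margin
        print(f"index at {index_rate:.0%}: payment ${monthly_payment(principal, rate):,.0f}")

As the index climbs from 1% to 5%, the monthly payment jumps from roughly $900 to roughly $1,400- exactly the kind of increase that pushed already-stretched borrowers into default.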

In March 2007, the housing market began its plunge: home sales fell 13 percent in that single month, and prices followed them downwards. It was, however, only the beginning of a prolonged market correction that hasn’t quite ended yet. As the housing market deflated nationwide, the interest rates on ARMs climbed steadily upwards. Millions of homeowners across the U.S. found themselves unable to pay the higher interest rates on their mortgages, and were forced to default on their loans. This resulted in banks and mortgage lenders foreclosing on those homes- in other words, throwing the former homeowners out and assuming ownership of the properties. By July 2009, more than 1.5 million homes had been foreclosed, and another 3.5 million were expected to see the same fate by the end of the year.

Following closely on the heels of the “victims” of foreclosure are those who are described as being “under sea level”. These are the homeowners who owe more on their mortgages than their houses are now worth; the technical term for this predicament is negative equity. Homeowners in this position couldn’t clear their mortgage debts even by selling off their homes; they are, therefore, extremely vulnerable to foreclosure in the near future. As of December 2008, there were 7.5 million homeowners under sea level. Another 2.1 million stood right on the brink, with homes worth only 5 percent more than their mortgages.

The Misfortunes of the Mortgage Lenders

The drastic fall in house prices first affected those financial institutions that were directly involved with the housing industry- the banks and corporations that financed house construction and mortgage lending. As the number of foreclosures soared, these companies lost millions of dollars in unredeemable loans. And if you’re thinking that at least they were left with the properties that they gained through foreclosure- well, that didn’t exactly help a great deal.

Look at it this way: in 2005, a bank puts up a $10,000 ARM so that a nice young couple can buy a house and start a family. The bank expects to profit on this investment through the interest payments it will receive over, let’s say, the next ten years. Since the economy is chugging along just swimmingly, the interest rate on the ARM is kept relatively low. But that’s okay from the bank’s point of view, because the value of the property itself makes up for the low interest rate. You see, even in the regrettable event that the new homeowners fail to keep up on their payments and the bank has to foreclose on the property, it ends up with a very marketable house that’s probably worth even more than the $10,000 it initially cost. Nice.

Once house prices began on their steep decline, though, and the interest rates on ARMs reset at much higher levels, things got ugly. There wouldn’t really have been a problem if every mortgagor still had the ability to pay the interest on his mortgage; but the aggressive subprime lending of the last few years- even to people who didn’t really qualify for mortgages- meant that there were millions of homeowners who just couldn’t pay the higher interest rates, and were forced to default on their loans.

After the inevitable foreclosures that followed, banks and mortgage lenders were left in possession of houses that nobody wanted to buy and were now worth almost nothing. Returning to our earlier example, we’d find that the bank would have lost nearly the entirety of the $10,000 that it initially put up; it would lose out on interest payments after foreclosure, and would be left with a house that could hardly even sell for five hundred dollars.
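
Running the numbers on that example makes the damage plain. These are the same invented figures as above:

    # The bank's position on a foreclosed loan, before and after the crash.
    loan = 10_000
    boom_resale_value = 11_000   # pre-crash: the foreclosed house sells for more than the loan
    bust_resale_value = 500      # post-crash: hardly anyone will buy

    print(boom_resale_value - loan)   #  +1000: foreclosure barely stings
    print(bust_resale_value - loan)   #  -9500: nearly the whole loan is gone

Multiply that second line by a few million foreclosures, and you have the mortgage industry's balance sheet in 2008.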

Hence, it’s no wonder that twenty-five major subprime lenders (several of them Fortune 500 companies) had to declare bankruptcy between 2007 and 2008.

Introducing Fannie, Freddie and the Credit Crunch

Next in the line of fire were the companies that dealt in the trade and securitization of mortgages. The two giants in this industry had come to be known as Fannie Mae and Freddie Mac- quirky names derived phonetically from their acronyms: FNMA (Federal National Mortgage Association) and FHLMC (Federal Home Loan Mortgage Corporation). Both Fannie and Freddie were Government Sponsored Enterprises (GSEs), meaning they operated in a sort of grey area between the public and private sectors.

Fannie Mae and Freddie Mac were responsible for buying mortgages from mortgage lenders, and creating and selling mortgage-backed securities (MBSs). By buying mortgages, they provided banks and other financial institutions with fresh money to make new loans; and by creating and selling MBSs, they created a secondary mortgage market that investment banks and securities traders could participate in. The primary purpose of all this was to give the American housing and credit markets increased flexibility and liquidity. Fannie and Freddie were so deeply enmeshed in the housing market that by 2008 they owned or guaranteed $5.1 trillion in residential mortgages- about half the total U.S. mortgage market.
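
In its simplest form, the securitization that Fannie and Freddie performed can be pictured as pooling mortgages and passing the payments through to investors in proportion to their stakes. This sketch uses invented figures and ignores tranching, guarantees, and fees, all of which matter enormously in practice:

    # A stripped-down pass-through security: whatever the borrowers
    # actually pay gets split among the investors pro rata.
    monthly_payments = [900, 1100, 0, 1300, 0]   # zeros are defaulted borrowers
    pool_total = sum(monthly_payments)

    investor_shares = {"pension_fund": 0.5, "hedge_fund": 0.3, "bank": 0.2}
    for investor, share in investor_shares.items():
        print(investor, pool_total * share)

The catch is visible even in the toy version: when borrowers stop paying, every investor downstream of the pool feels it at once.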

The final link in the chain consisted of the members of the shadow banking system- investment banks such as Lehman Brothers, Bear Stearns and Goldman Sachs. They traded in MBSs in the secondary mortgage market, and insured pools of mortgages using ridiculously complex financial instruments such as CDOs and CDSs. These derivatives were sold back to mortgage lenders and to commercial banks throughout the economy. Investment banks aren’t subject to the reserve requirements that commercial banks face, but their influence on the economy became increasingly important as the nation’s financial sector loaded up on debt and the financial instruments that backed that debt.

Therefore, when the subprime mortgage industry imploded, it wasn’t only the mortgage lenders who were affected. An entire industry that dealt in mortgage-backed securities went down with it; Fannie Mae and Freddie Mac had to be effectively nationalized to prevent their complete collapse. The investment banks that purported to spread the risks of the mortgage industry also sustained huge losses because they had completely failed to foresee the effects of the housing market crash. And finally, banks across the country that held portfolios of credit derivatives were left with worthless, “toxic” junk.

These huge losses across the financial sector created what became known as the “Credit Crunch”. Billions of dollars of capital that had been based on the housing market were wiped off the balance sheets of banks and other financial institutions. This left them with very little ability to extend new credit to consumers. It was at this point that the crisis was said to extend its reach from “Wall Street to Main Street”, meaning that it no longer affected just the major financial institutions, but was now impacting the lives of citizens throughout the country.

As credit streams began to freeze, the entire economy slowed, and then descended into recession. Businesses began to shut down and unemployment rose. Investment and consumer spending plummeted. Alongside the U.S., other developed nations experienced similar symptoms. And with the economies of the developed world in tatters, developing nations lost major sources of manufacturing revenue; their economies began to slow, too.

All in all, the global financial crisis had arrived.

And the Rest is History

Through most of 2008, the U.S. government scrambled to contain the crisis. It initiated the Troubled Asset Relief Program (TARP) to help financial institutions get rid of their toxic assets, and injected nearly $800 billion into the economy through a stimulus package. The worst economic crisis since the Great Depression isn’t about to go down without a fight, though; many experts believe the economy won’t fully recover until 2011.

But let’s not bother ourselves with speculations about the future. Instead, let’s go back to what we had initially set out to do: have we managed to identify the criminal mastermind behind the global financial crisis? No. Of course not. While we probably managed to gain a few interesting insights into the causes of the crisis, we never really came close to achieving that goal. If anything, we should have come to the conclusion by now that it’s ridiculous to assume that there was any one person behind it all.

Real life is far too boring for that.

[1] In an effort to resurrect the U.S. economy at the height of the Great Depression, President Franklin Delano Roosevelt initiated a sweeping range of economic reforms between 1933 and 1935 that collectively became known as the New Deal. In contrast to Reagan’s policies, Roosevelt’s New Deal stressed the importance of fiscal responsibility and government oversight of the economy.
