Our next Career-in-Risk event is scheduled for March 31st, 2016 at the Reagan Building, Washington DC.
Please see the flyer below for more information.
We welcome all young professionals, students and those aspiring to enter the Financial Risk industry to join us for this interesting conversation.
Steven Lee, Chair
The Financial Risk Institute (FinRisk)
The Financial Risk Institute (FinRisk) London will host a special event, a debate in conjunction with the Bank Treasury Risk Management (BTRM), on Thursday, June 4, 2015, at 1730 GMT.
The debate will cover topical bank balance sheet risk management issues of the day. The event will be live-streamed, and FinRisk participants globally, including those in the US and Asia, are welcome to join.
Event and registration details are now available HERE; please follow the link for more information and to register.
Article by Mayra Rodríguez Valladares, managing principal at MRV Associates, a capital markets and financial regulatory consulting and training firm in New York.
This article first appeared in The New York Times and is posted here with permission of the author.
– March 2015
With all the new laws and regulations since the financial crisis, it would be easy to believe that the banking industry is safer. Unfortunately, speakers at the Federal Reserve conference at George Washington University on Friday offered a range of reasons for why that’s not the case.
First, the regulatory framework remains fragmented. Not only do we have “a dual state and federal banking charter system,” as former Representative Barney Frank told the audience of regulators, bankers, lobbyists, consultants and academics, we also have three national bank regulators, 50 state bank regulators and two derivatives regulators, not to mention different regulators for securities, broker-dealers and insurance companies. Private equity and hedge funds remain largely unregulated. It is unreasonable to expect that all these entities would communicate, not to mention work well together, to detect the next crisis.
In his keynote speech, the former Federal Reserve chairman, Paul A. Volcker, said that he continued to ask people “if they like our regulatory system — and cannot find anyone who does.” He said that he and his research group, the Volcker Alliance, “will propose that the U.S. regulatory framework be streamlined,” but added that the proposal was likely to meet significant resistance from both the regulatory agencies and politicians, who all have a stake in the current structure.
Lack of openness throughout the financial sector, not just in banking, was also a significant problem that led to the crisis and will continue to thwart regulators in years to come. “During the crisis, it was very difficult to get access to information,” said Coryann Stefansson, managing director at PricewaterhouseCoopers. “It is a myth that you can get everything from a bank.”
But while the Basel Committee on Banking Supervision has revised its requirements for important bank risk disclosures, they are unlikely to go into effect until next year in the 28 member countries — and even then, enforcement will not necessarily be uniform.
And in the United States, a significant number of financial institutions that are not banks do not report financial data to the Treasury’s new Financial Stability Oversight Council. Among those that do report, reporting is not done in a uniform manner that would enable regulators to form a consistent view of risk in the system. Current political pressure to raise the $50 billion asset threshold above which banks face greater regulatory scrutiny may make it harder to see how complex and interconnected some firms are.
At the conference, Paul Saltzman, president of the Clearing House Association, a trade group, said that bank stress tests were “an important Federal Reserve Bank regulatory tool.”
But many of those looking at the results of the latest stress tests and Comprehensive Capital Analysis and Review were most interested in whether banks could pay dividends. Few, meanwhile, examined how the tests depended on unreliable, inaccurate data that rendered capital ratios, living-wills data and stress-test numbers meaningless. For two decades, I have seen banks of every size struggle with data collection, validation, ratio calculations and use of models. Unfortunately, the Basel Committee on Banking Supervision confirmed my concerns in January. Its survey results showed that nearly seven years after the global financial crisis, banks’ technology is not robust enough to give risk managers and regulators a full view of how a bank is coping with its credit, market, operational and liquidity risks.
Another significant problem is that many of the systemically important banks have not changed structurally. Arthur E. Wilmarth Jr., a law professor at George Washington University, told conference attendees that even though “returns on equity for most large global banks are well below 10 percent,” banks like Goldman and JPMorgan Chase “are not changing their business models.” In fact, some banks like Citibank are increasing their risk exposures. “Citibank has increased its footprint in derivatives and commodities,” he said. “Citi does not have a good track record in trading.”
Unfortunately, large banks remain insufficiently capitalized to sustain unexpected losses, which could affect ordinary taxpayers. Anat R. Admati, a finance professor at the Stanford Graduate School of Business, said at the event that banks were “not the only risk takers in our economy,” adding, “Entrepreneurs in Silicon Valley take risks all the time.” Yet those entrepreneurs, unlike big banks, will not be bailed out by taxpayers, who share in banks’ downside but not in their upside. Additionally, as Professor Admati said, “Capital ratios and stress tests are based on risk weights and complex models, which can have perverse effects such as biasing banks’ decisions away from worthy business lending and which can increase the interconnectedness and fragility of the system.”
Professor Admati said she was also concerned about how much market participants and even regulators “know about banks’ interconnections to each other and the extent of their derivatives positions.”
This unanswered question is what concerns financial reformers — and if taxpayers could learn about banks’ activities, it would worry them as well.
Steve H. Hanke, Professor of Applied Economics and Co-Director of the Institute for Applied Economics, Global Health, and the Study of Business Enterprise at the Johns Hopkins University in Baltimore.
(Twitter: @Steve_Hanke)
This article first appeared in GlobeAsia, April 2015, and is posted here with permission of the author.
We are still in the grip of the Great Recession. Economic growth remains anemic and below its trend rate in most parts of the world. And what’s more, this state of subdued economic activity has been with us for over seven years.
In the U.S. (and elsewhere) the central bank created a classic aggregate demand bubble that became visible in 2004. The Fed’s actions also facilitated the creation of many market-specific bubbles in the housing, equity, and commodity markets. These bubbles all dramatically burst, with the bankruptcy of Lehman Brothers in September 2008.
The price changes that occurred in the second half of 2008 were truly breathtaking. The most important price in the world – the U.S. dollar-euro exchange rate – moved from 1.60 to 1.25. Yes, the greenback soared by 28% against the euro in three short months. During that period, gold plunged from $975/oz to $735/oz, and crude oil fell from $139/bbl to $67/bbl.
What was most remarkable was the fantastic change in the inflation picture. In the U.S., for example, the year-over-year consumer price index (CPI) was increasing at an alarming 5.6% rate in July 2008. By February 2009, that rate had dropped into negative territory, and by July 2009, the CPI was contracting at a -2.1% rate. This blew a hole in a well-learned dogma: that changes in inflation follow changes in policy, with long and variable lags. Since the world adopted a flexible exchange-rate “non-system”, changes in inflation can strike like a lightning bolt.
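The year-over-year figures quoted above are simple ratios of consumer price index levels twelve months apart. A minimal sketch of the arithmetic (the index levels below are illustrative values chosen to reproduce the quoted rates, not figures taken from the article):

```python
def yoy_inflation(cpi_now: float, cpi_year_ago: float) -> float:
    """Year-over-year inflation, in percent, from two CPI index levels."""
    return (cpi_now / cpi_year_ago - 1.0) * 100.0

# Illustrative index levels approximating U.S. CPI-U for the dates cited:
print(round(yoy_inflation(219.964, 208.299), 1))  # July 2008 vs July 2007 -> 5.6
print(round(yoy_inflation(215.351, 219.964), 1))  # July 2009 vs July 2008 -> -2.1
```

The swing from +5.6% to -2.1% in a year is exactly the kind of lightning-bolt change in inflation the author describes.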
True to form, central bankers have steadfastly denied any culpability for creating the bubbles that so spectacularly burst during the Panic of 2008-09. What’s more, they have repeatedly told us that they have saved us from a Great Depression. For central bankers, the “name of the game” is to blame someone else for the world’s economic and financial troubles.
To understand why the Fed’s fantastic claims and denials are rarely subjected to the indignity of empirical verification, we have to look no further than Milton Friedman. In a 1975 book of essays in honor of Friedman, Capitalism and Freedom: Problems and Prospects, Gordon Tullock wrote:
….it should be pointed out that a very large part of the information available on most government issues originates within the government. On several occasions in my hearing (I don’t know whether it is in his writing or not but I have heard him say this a number of times) Milton Friedman has pointed out that one of the basic reasons for the good press the Federal Reserve Board has had for many years has been that the Federal Reserve Board is the source of 98 percent of all writing on the Federal Reserve Board. Most government agencies have this characteristic….
Friedman’s assertion has subsequently been supported by Larry White’s research. He found that, in 2002, 74% of the articles on monetary policy published by U.S. economists in U.S.-edited journals appeared in Fed-sponsored publications, or were authored (or co-authored) by Fed staff economists.
The explanations of the Great Recession have been all over the map and surprisingly incoherent. For example, Andrew Lo reviewed twenty-one books on the topic in the January 2012 issue of the Journal of Economic Literature. He found that the mainstream, post-crisis approach amounts to an ad hoc approach that lacks a unifying theory or theme. That said, the literature – from Alan Greenspan’s 2013 book The Map and The Territory to the 2009 tome Animal Spirits by George Akerlof and Robert Shiller – is punctuated with a great deal of conjecture about changes in investors’ animal spirits and how these wreaked havoc on the financial markets and the economy during the Panic of 2008-09 and the ensuing Great Recession. Much of this is borrowed from a recent fashion in economics: behavioral finance.
But, this line of argument goes back to earlier theories of the business cycle; theories that stress the importance of changes in business sentiment. For example, members of the Cambridge School of Economics, which was founded by Alfred Marshall, all concluded that fluctuations in business confidence are the essence of the business cycle. John Maynard Keynes put great stress on changes in confidence and how they affected consumption and investment patterns. Frederick Lavington, a Fellow of Emmanuel College and the most orthodox of the Cambridge economists, went even further in his 1922 book, The Trade Cycle. Lavington concluded that, without a “tendency for confidence to pass into errors of optimism or pessimism,” there would not be a business cycle.
So, the mainstream approach fails to offer much of a theory of national income determination. The monetary approach fills this void. Tim Congdon – a master of the high theory of monetary economics and all the knotty practical details of money and banking, too – has supplied us with a most satisfying and comprehensive treatment of the causes of business cycles and the current Great Recession. Three of Congdon’s most important books are listed in the references below.
The monetary approach posits that changes in the money supply, broadly determined, cause changes in nominal national income and the price level (as well as relative prices – like asset prices). Sure enough, the growth of broad money and nominal GDP are closely linked. Indeed, the data in the following chart speak loudly.
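The link between broad money and nominal income that drives this argument is conventionally summarized by the equation of exchange. The author does not write it out, but it is the standard statement underpinning the monetary approach (a textbook identity, not drawn from the article):

```latex
% Equation of exchange: money stock M times velocity V equals
% the price level P times real output Y, i.e. nominal GDP.
M V = P Y
% In growth-rate form: if velocity V is roughly stable, growth in
% broad money translates one-for-one into growth in nominal GDP.
\frac{\Delta M}{M} + \frac{\Delta V}{V} = \frac{\Delta P}{P} + \frac{\Delta Y}{Y}
```

On this reading, sub-trend growth in broad money implies sub-trend growth in nominal GDP, which is the pattern the accompanying chart is meant to show.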
Given the plunge in broad money, the contraction in the U.S. economy was inevitable and in line with the monetary approach to national income determination (see the accompanying chart). But why has post-crisis growth remained so low? Because the growth in broad money has remained well below its trend rate. Indeed, Divisia M4 is only growing at a 2.7% year-over-year rate. Why? Since the crisis, the policies affecting bank regulation and supervision have been massively restrictive. By failing to appreciate the monetary consequences of tighter, pro-cyclical bank regulations, the political chattering classes and their advisers have blindly declared war on bank balance sheets. In consequence, bank money, which accounts for between 70% and 90% of broad money in most countries, has contracted or failed to grow very much since the crisis. Since bank money is the elephant in the room, even central bank quantitative easing has been unable to fully offset the tightness that has enveloped banks. Looking at the U.S., there is a ray of hope, however: credit to the private sector has finally started to grow above its trend rate.
The money and credit picture in the Eurozone has been much more dire than in the U.S., largely because, until recently, the European Central Bank refused to offset, with quantitative easing, the regulatory-induced monetary tightening imposed on banks. That is now changing, and credit to the private sector is about to turn positive for the first time since 2012 (see the accompanying chart).
China, the last of the Big Three, has a monetary problem (as well as others). Broad money has been growing at below its trend rate since 2012, and it’s becoming weaker with each passing month (see the accompanying chart). In an attempt to pump up the growth rate in broad money, China’s central bank has cut interest rates twice since last November and lowered reserve requirements for banks. But so far, it has proven to be too little, too late. Not surprisingly, official forecasts for China’s 2015 GDP growth have been dialed back.
The monetary approach offers a coherent theory of national income determination – one that stands up to the test of empirical verification. It explains the Panic of 2008-09 and the ensuing Great Recession with ease. The monetary approach reigns supreme.
References:
• Congdon, Tim. Money and Asset Prices in Boom and Bust. London: Institute of Economic Affairs, 2005.
• Congdon, Tim. Central Banking in a Free Society. London: Institute of Economic Affairs, 2009.
• Congdon, Tim. Money in a Free Society: Keynes, Friedman, and the New Crisis in Capitalism. New York: Encounter Books, 2011.
CYBERSECURITY NEWS HEADLINES
Of late, I have noticed that few days pass without some major cyber-security incident or development, many of which receive headline news coverage. We are already familiar with the attacks and breaches reported at Anthem, Sony, Target, Home Depot, JPMorgan, and many more. Yet in just the last few weeks, the technology columns of major news networks, which normally cover broad technology trends and news, were instead flooded with news of cyber attacks, threats and vulnerabilities. These are some of the headlines found in those columns in recent weeks alone:
- “Welcome to a World Less Safe Everywhere You Look”
- “U.S. creates new agency to lead cyber threat tracking”
- “Cyber attacks top US threat list”
- “JPMorgan Goes to War (with Cyber Security Team)”
- “(Singapore) Government to set up Cyber Security Agency in April”
- “Spying Campaign Bearing NSA’s Hallmark Found Infecting Thousands of Computers”
- “Sim card firm confirms hack attacks”
- “FBI Puts $3 Million Bounty on Russian Hacker Tied to Heists”
- “Samsung hit by latest smart TV issue”
- “Account data stolen in TalkTalk hack”
- “China ‘drops US technology firms’ (from China’s official list of approved products)”
- “Lenovo victim of cyber-attack”
- “FBI Is Close to Finding Hackers in Anthem Health-Care Data Theft”
- “Intuit Denies Putting Profit Before TurboTax-User Security”
- “Malicious Emailers Find Healthcare Firms Juicy Prey”
BOARD & EXECUTIVE MANAGEMENT RESPONSE
This is a very disturbing trend. Many senior executives and board directors are at a loss as to how they can effectively respond to this “new” challenge. To be sure, it is by no means new. Yet the pace at which attacks are hitting corporations, together with their destructive effects and the attendant negative publicity, is making senior executive management and board directors extremely nervous.
Most senior executives and board directors have very limited or no understanding of what is seen as the technical nature of cyber-security threats. Yet these threats are no more difficult to understand than many of the requirements of the Dodd-Frank Act: the Comprehensive Capital Analysis and Review (CCAR) stress testing, modeling, and estimation of PDs and LGDs are arguably as complex, if not more so. Once we remove the fluff of complexity that many specialists like to throw at the less informed, things become a lot more comprehensible. We just need to learn how to connect the dots. Pleading ignorance and incompetence will no longer suffice.
While senior executives and board directors are not expected to know all the technical details surrounding cyber threats and security, “sufficient” education is needed so that they can discharge their oversight and management responsibilities effectively, or at least meet the basic fiduciary expectations of stakeholders, especially regulators. Given the massive deployment of all sorts of technologies in today’s enterprises, it is hard to know even where to start looking. Basic education, internal controls and processes (e.g. password management, segregation of duties) are important, but they will not adequately cover the threats that enterprises face today. Beyond those basic controls, it is important to examine the current state of security exploits.
SECURITY EXPLOIT THEMES
Many security providers have good frameworks and methodologies to address these challenges. It is worthwhile, though, to highlight some key findings from recent cyber-security research. The recently released HP Security Research Cyber Risk Report 2015 identified several key themes that are helpful in understanding today’s cyber-security threats and the areas everyone needs to keep a lookout for. Below are comments extracted from that 2015 report on the key themes found in HP’s studies, with some of my thoughts and edits:
Theme #1: Well-known attacks still commonplace
Attackers continue to leverage well-known techniques to successfully compromise systems and networks. Many exploited vulnerabilities took advantage of code written many years ago; some are even decades old. Exploitation of widely deployed client-side and server-side applications is still commonplace, and such attacks are even more prevalent in poorly coded middleware applications, such as software as a service (SaaS). While newer exploits may have garnered more attention in the press, attacks from years gone by still pose a significant threat to enterprise security.
Suggested Action: Businesses need to employ a comprehensive patching strategy to ensure systems are up to date with the latest security protections to reduce the likelihood of these attacks succeeding.
Theme #2: Misconfigurations are still a problem
Many vulnerabilities reported were related to server misconfiguration. Access to unnecessary files and directories seems to dominate the misconfiguration-related issues. The information disclosed to attackers through these misconfigurations provides additional avenues of attack and allows attackers the knowledge needed to ensure their other methods of attack succeed.
Suggested Action: Regular penetration testing and verification of configurations by internal and external entities should be performed to identify configuration errors before attackers exploit them.
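To make the misconfiguration theme concrete, here is a minimal, hypothetical sketch of the kind of check a configuration review might automate: flagging web-exposed paths that match commonly sensitive name patterns. The pattern list and function names are illustrative assumptions, not part of the HP report, and a real review would go far beyond name matching:

```python
# Hypothetical patterns for files and directories that should never be
# reachable from the web (version-control metadata, secrets, backups).
SENSITIVE_PATTERNS = (".git", ".env", ".bak", "backup", "phpinfo", "web.config")

def flag_sensitive(exposed_paths):
    """Return the subset of exposed paths matching known-sensitive patterns."""
    return [p for p in exposed_paths
            if any(pat in p.lower() for pat in SENSITIVE_PATTERNS)]

print(flag_sensitive(["/index.html", "/.git/config", "/db_backup.sql"]))
# -> ['/.git/config', '/db_backup.sql']
```

Even a crude filter like this catches exactly the “access to unnecessary files and directories” problem the theme describes; the point is to run such checks routinely, before attackers do.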
Theme #3: Newer technologies, new avenues of attack
New technologies bring with them new attack surfaces and security challenges. This past year saw a rise in the already prevalent mobile-malware arena. Even though the first malware for mobile devices was discovered a decade ago, 2014 was the year when mobile malware stopped being considered just a novelty. Connecting existing technologies to the Internet also brings a new set of exposures: point-of-sale (POS) systems were a primary target of multiple pieces of malware in 2014. As physical devices become connected through the Internet of Things (IoT), the diverse nature of these technologies raises concerns about security, and privacy in particular. Even as we look at new technologies, we must not forget older technologies that are still used pervasively. Many have basic, structural failures and loopholes, such as hardcoded passwords, that leave them extremely vulnerable.
Suggested Action: Enterprises should understand, and know how to mitigate, the risks introduced to a network prior to the adoption of new technologies. They should also take stock of older technologies, understand the vulnerabilities these might pose, remove as many as possible, and mitigate residual risks as needed.
Theme #4: Gains by determined attackers
Attackers are persistent and use both old and new vulnerabilities to penetrate all traditional levels of defenses. They maintain access to victim systems by choosing attack tools that will not show on the radar of anti-malware and other technologies. In some cases, these attacks are perpetrated by actors representing nation-states, or are at least in support of nation-states. In addition to the countries traditionally associated with this type of activity, newer actors such as North Korea were visible in 2014.
Suggested Action: Network defenders should understand how events on the global stage impact the risk to systems and networks.
Theme #5: Cyber-security legislation on the horizon
Activity in both European and U.S. courts linked information security and data privacy more closely than ever. As legislative and regulatory bodies consider how to raise the general level of security protection in the public and private spheres, the avalanche of retail breaches reported in 2014 spurred increased concern over how individuals and corporations are affected once private data is exfiltrated and misused, as in the Anthem, Sony and Target incidents.
Suggested Action: Companies should follow developments and stay up to date on new cyber-security and related legislation and regulation. These will, among other things, affect how companies must monitor their assets and report potential incidents.
Theme #6: The challenge of secure coding
Commonly exploited software vulnerabilities are largely facilitated by defects, bugs, and logic flaws. Security professionals have discovered that most vulnerabilities stem from a relatively small number of common software programming errors. Much has been written to guide software developers on integrating secure-coding best practices into their daily development work. Despite all of this knowledge, we continue to see old and new vulnerabilities in software that attackers swiftly exploit.
Suggested Action: It may be challenging, but it is long past time for software development to become synonymous with secure software development. Management should insist that third-party products and services, as well as internal software developments, treat this as a key requirement, and that the developers concerned demonstrate how the requirement has been met. While it may never be possible to eliminate all code defects, a properly implemented secure development process can lessen the impact and frequency of such bugs and exploits.
Theme #7: Complementary protection technologies
In May 2014, Symantec’s senior vice president Brian Dye declared antivirus dead and the industry responded with a resounding “no, it is not.” Both are right. Mr. Dye’s point is that AV only catches 45 percent of cyber-attacks—a truly abysmal rate. The 2014 threat landscape shows that enterprises most successful in securing their environment employ complementary protection technologies. These technologies work best when paired with a mentality that assumes a breach will occur instead of only working to prevent intrusions and compromise.
Suggested Action: By using all tools available and not relying on a single product or service, defenders place themselves in a better position to prevent, detect, and recover from attacks.
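The arithmetic behind complementary protection is worth spelling out. If each layer’s detection were statistically independent of the others (an optimistic simplification, made purely for illustration), the misses multiply, so even two mediocre layers beat one. A sketch of that calculation:

```python
def combined_detection(rates):
    """Overall detection rate of independent layers, given each layer's catch rate."""
    miss = 1.0
    for r in rates:
        miss *= (1.0 - r)  # probability that every layer misses the attack
    return 1.0 - miss

# One tool catching 45% misses 55% of attacks; two such independent
# layers miss only 0.55 * 0.55 = 30.25%, i.e. catch about 69.75%.
print(round(combined_detection([0.45, 0.45]), 4))
```

In practice, tools share blind spots, so the real gain is smaller, which is exactly why the pairing matters more than the count: layers should be chosen to fail differently.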
CONCLUSION
Cyber attacks will continue, and many corporations are starting to take cyber security a lot more seriously. In part, this reflects the need to catch up after years of inattention to something as fundamental as the security of the enterprise technology infrastructure, including third-party solutions and outsourced services.
Technology deployment will continue at lightning speed, and controls and security cannot be allowed to fall behind. In the pursuit of new revenue channels and greater profits through new technologies and solutions, senior executives and board directors need to be cognizant of the “industrial strength” of these products and solutions, and technology and solution providers must be held accountable for delivering products and services that meet requirements. While we can never plug every security hole, many current attacks and breaches result from basic failures, program bugs, and sloppy programming methods. Holding third-party providers accountable for more secure solutions, and ensuring that internal teams lock down systems through proper configuration, basic internal controls and sound security practices, will go a long way toward minimizing loopholes and points of attack. That said, this will be a long process requiring transformational change across the enterprise, especially in the development and deployment of technology solutions. Above all, we must not forget that people, an issue not touched on here, will remain the weakest link. Education, awareness, and cultural change will be needed to help enterprises protect themselves more effectively against the rise of cyber attacks.
Steven P. Lee, Global Client Consulting
Hidden in plain sight:
What really caused the world’s worst financial crisis — and why it could happen again
Peter J. Wallison, AEI’s Arthur F. Burns Fellow in Financial Policy Studies.
Because of the government’s extraordinary role in bringing on the crisis, it should not be treated as an inherent part of a capitalist or free market system, or used as a pretext for greater government control of the financial system. On the contrary, understanding the financial crisis for what it was will permit the debate we should have had about the Dodd-Frank Act.
Note: The following is the preface of Peter J. Wallison’s book “Hidden in Plain Sight: What really caused the world’s worst financial crisis — and why it could happen again,” published in January 2015.
Far from being a failure of free market capitalism, the Depression was a failure of government. Unfortunately, that failure did not end with the Great Depression. . . . In practice, just as during the Depression, far from promoting stability, the government has itself been the major single source of instability. — Milton Friedman
Political contests often force the crystallization of answers to difficult political issues, and so it was with the question of responsibility for the financial crisis in the 2008 presidential election. In their second 2008 presidential debate, almost three weeks after Lehman Brothers had filed for bankruptcy, John McCain and Barack Obama laid out sharply divergent views of the causes of the financial convulsion that was then dominating the public’s concerns. The debate was in a town-hall format, and a member of the audience named Oliver Clark asked a question that was undoubtedly on the mind of every viewer that night:
Clark: Well, senators, through this economic crisis, most of the people that I know have had a difficult time. . . . I was wondering what it is that’s going to actually help these people out?
Senator McCain: Well, thank you, Oliver, that’s an excellent question. . . . But you know, one of the real catalysts, really the match that lit this fire, was Fannie Mae and Freddie Mac . . . they’re the ones that, with the encouragement of Sen. Obama and his cronies and his friends in Washington, that went out and made all these risky loans, gave them to people who could never afford to pay back . . .
Then it was Obama’s turn.
Senator Obama: Let’s, first of all, understand that the biggest problem in this whole process was the deregulation of the financial system. . . . Senator McCain, as recently as March, bragged about the fact that he is a deregulator. . . . A year ago, I went to Wall Street and said we’ve got to reregulate, and nothing happened. And Senator McCain during that period said that we should keep on deregulating because that’s how the free enterprise system works.
Although neither candidate answered the question that Clark had asked, their exchange, with remarkable economy, effectively framed the issues both in 2008 and today: was the financial crisis the result of government action, as McCain contended, or of insufficient regulation, as Obama claimed?
Since this debate, the stage has belonged to Obama and the Democrats, who gained control of the presidency and Congress in 2008, and their narrative about the causes of the financial crisis was adopted by the media and embedded in the popular mind. Dozens of books, television documentaries, and films have told the easy story of greed on Wall Street or excessive and uncontrolled risk-taking by the private sector — the expected result of what the media has caricatured as “laissez-faire capitalism.” To the extent that government has been blamed for the crisis, it has been for failing to halt the abuses of the private sector.
The inevitable outcome of this perspective was the Dodd-Frank Wall Street Reform and Consumer Protection Act, by far the most costly and restrictive regulatory legislation since the New Deal. Its regulatory controls and the uncertainties they engendered helped produce the slowest post-recession US recovery in modern history. Figure 1 compares the recovery of gross domestic product (GDP) per capita since the recession ended in June 2009 with the recoveries following recessions since 1960.
Unfortunately, Dodd-Frank may provide a glimpse of the future. As long as the financial crisis is seen in this light — as the result of insufficient regulation of the private sector — there will be no end to the pressure from the left for further and more stringent regulation. Proposals to break up the largest banks, reinstate Glass-Steagall in its original form, and resume government support for subprime mortgage loans are circulating in Congress. These ideas are likely to find public support as long as the prevailing view of the financial crisis is that it was caused by the risk-taking and greed of the private sector.
For that reason, the question of what caused the financial crisis is still very relevant today. If the crisis were the result of government policies, the Dodd-Frank Act was an illegitimate response to the crisis and many of its unnecessary and damaging restrictions should be repealed. Similarly, proposals and regulations based on a false narrative about the causes of the financial crisis should also be seen as misplaced and unfounded.
Sources: Bureau of Economic Analysis; Census Bureau; authors’ calculations. Adapted from Tyler Atkinson, David Luttrell, and Harvey Rosenblum, “How Bad Was It? The Costs and Consequences of the 2007–09 Financial Crisis,” Staff Papers (Federal Reserve Bank of Dallas) no. 20 (July 2013): 4.
Note: The gray area indicates the range of major recessions since 1960, excluding the short 1980 recession.
As demonstrated by Dodd-Frank itself, first impressions are never a sound basis for policy action, and haste in passing significant legislation can have painful consequences. During the Depression era, it was widely believed that the extreme level of unemployment was caused by excessive competition. This, it was thought, drove down prices and wages and forced companies out of business, causing the loss of jobs. Accordingly, some of the most far-reaching and hastily adopted legislation — such as the National Industrial Recovery Act and the Agricultural Adjustment Act (both ultimately declared unconstitutional) — was designed to protect competitors from price competition. Raising prices in the midst of a depression seems wildly misguided now, but it was a result of a mistaken view about what caused the high levels of unemployment that characterized the era.
In the 1960s, Milton Friedman and Anna Schwartz produced a compelling argument that the Great Depression was an ordinary cyclical downturn that was unduly prolonged by the mistaken monetary policies of the Federal Reserve. Their view and the evidence that they adduced gradually gained traction among economists and policy makers. Freed of its association with unemployment and depression, competition came to be seen as a benefit to consumers and a source of innovation and economic growth rather than a threat to jobs.
With that intellectual backing, a gradual process of reducing government regulation began in the Carter administration. Air travel, trucking, rail, and securities trading were all deregulated, followed later by energy and telecommunications. We owe cell phones and the Internet to the deregulation of telecommunications, and a stock market in which billions of shares are traded every day — at a cost of a penny a share — to the deregulation of securities trading. Because of the huge reductions in cost brought about by competition, families don’t think twice about making plane reservations for visits to Grandma, and we take it for granted that an item we bought over the Internet will be delivered to us, often free of a separate charge, the next day. These are the indirect benefits of a revised theory for the causes of the Depression that freed us to see the benefits of competition.
We have not yet had this epiphany about the financial crisis, but the elements for it — as readers will see in this book — have been hidden in plain sight. Accordingly, what follows is intended to be an entry in a political debate — a debate that was framed in the 2008 presidential contest but never actually joined. In the following pages, I argue that but for the housing policies of the US government during the Clinton and George W. Bush administrations, there would not have been a financial crisis in 2008. Moreover, because of the government’s extraordinary role in bringing on the crisis, it is invalid to treat it as an inherent part of a capitalist or free market system, or to use it as a pretext for greater government control of the financial system.
I do not absolve the private sector, although that will be the claim of some, but put the errors of the private sector in the context of the government policies that dominated the housing finance market for the 15 years before the crisis, including the government regulations that induced banks to load up on assets that ultimately made them appear unstable or insolvent. I hope readers will find the data I have assembled informative and compelling. The future of the housing finance system and the health of the wider economy depend on a public that is fully informed about the causes of the 2008 financial crisis.
– By Peter J. Wallison
AEI’s Arthur F. Burns Fellow in Financial Policy Studies.
This article is excerpted from “Hidden in Plain Sight: What Really Caused the World’s Worst Financial Crisis and Why It Could Happen Again,” published by Encounter Books.
BOARD GOVERNANCE SERIES:
“Oversight and Management of Cyber Threats and Security Concerns”
Synopsis:
We are living in exciting times: digital technologies, mobile devices, cloud computing, big data analytics, and social media have brought immense benefits and opportunities to corporations and individuals alike. However, there is also a “dark side” to these developments, including snooping, theft, interference, disruption, and even destruction. This event will discuss the significant leadership challenges senior executives and board members face in managing these emerging risks for their organizations.
Cybersecurity threats are no longer trivial: significant financial losses, business disruption, and damage to corporate reputations are occurring across the globe at an alarming rate. Regulators and consumers are rightfully concerned. An attack can destroy value built over many years, through reputational harm, litigation, and potential liability for failing to take adequate steps to protect the company and its customers. More recently, we have seen the emergence of derivative lawsuits brought against companies, their executives, and their directors in the wake of an attack.
Best-practice corporate governance holds the board of directors responsible for the management of all risks within a company and accountable for setting the “tone at the top,” including the risk appetite. Accordingly, directors need to start asking critical questions: Is the company appropriately assessing cyber-risk? Is it devoting adequate resources to preventing attacks? Directors are not expected to be experts in cyber-risk and cybersecurity; they are, however, responsible for ensuring that the necessary framework is in place. So what are the challenges senior executives and directors face in fulfilling these responsibilities?
Join us as key representatives from industry and academia, current and former regulators, bankers, and consultants discuss and share their insights on this important topic: the governance of corporate cyber threats and security concerns.