Mr. Greenspan remarks on income inequality

Remarks by Chairman of the Board of Governors of the US Federal Reserve System, Mr. Alan Greenspan, at a symposium sponsored by the Federal Reserve Bank of Kansas City in Jackson Hole, Wyoming on 28/8/98.

Income Inequality: Issues and Policy Options

I am pleased once again to open this annual symposium. At the outset, I wish to thank Tom Hoenig and his staff for assembling a highly capable group of experts to inform us and to stimulate discussion on an important issue in the world economy. The study of income inequality - its causes, its consequences, and its potential policy implications - has a long history in economics, although it has not always had a high profile among researchers and policymakers. To borrow a phrase from Professor Atkinson, income distribution in recent years has been “brought in from the cold.” In part, that awareness has resulted from the experience of many industrialized economies with widening earnings inequality in the 1980s and 1990s. It has been heightened by interest in the consequences of economic change in developing, newly industrialized, and transition economies. The experience of industrialized countries, including the United States, with growing income inequality has spawned a great deal of research on the functioning of labor markets, on the sources of shifts in the demand for various types of skills, on the supply responses of workers, and on the efficacy of government efforts to intervene in the operation of labor markets. A number of those who have contributed importantly to this work will be participating in this conference.

One story that has emerged from that body of research is now familiar: Rising demand for those workers who have the skills to effectively harness new technologies has been outpacing supply, and, thus, the compensation of those workers has been increasing more rapidly than for the lesser skilled segment of the workforce. That this supply-demand gap has been an important source of widening earnings inequality is now widely accepted within the economics profession. However, the considerable diversity of experiences across countries as well as the finding that earnings inequality has also increased within groups of workers with similar measured skills and experience suggest that we may need to look deeper than skill-biased technological change if we are to fully understand widening wage dispersion. In particular, how have private and public institutions influenced inequality over the past two decades? What roles have been played by growing international trade and the evolving ways in which production is organized? Again, the participants in this symposium are well-equipped to speak to these issues, and we should learn much more about the causes of widening inequality during the next two days.

In discussing the extent to which large portions of the population are not reaping the benefits of economic growth, I hope that the participants at this conference will not stop with an analysis of trends in earnings - or, for that matter, even trends in income more broadly defined. Ultimately, we are interested in the question of relative standards of living and economic well-being. Thus, we need also to examine trends in the distribution of wealth, which, more fundamentally than earnings or income, represents a measure of the ability of households to consume.
And we will even want to consider the distribution of consumption, which likely has the advantage of smoothing through transitory shocks affecting particular individuals or households for just a year or two. Among these more comprehensive measures, data for the United States from the Federal Reserve’s Survey of Consumer Finances suggest that inequality in household wealth - that is, in net worth - was somewhat higher in 1989 than at the time of our earlier survey in 1963. Subsequently, the 1992 and 1995 surveys - and here our data are statistically more comparable from survey to survey than they were earlier - showed that wealth inequality remained little changed in terms of the broad measures.1 Nonetheless, that stability masks considerable churning among the subgroups. One particularly notable change was an apparent rise in the share of wealth held by the wealthiest families at the expense of other wealthy families; most of the change occurred within the top 10 percent of the distribution. Moreover, our research using the survey suggests that conclusions about the distribution of wealth are sensitive - although to a lesser degree than income - to the state of the economy and to institutional arrangements for saving. For instance, among the wealthiest ½ percent of households, business assets, which tend to be quite cyclical, are particularly important. At the other end of the distribution, owned principal residences, the values of which are not as sensitive to business cycle conditions, are a typical household’s most important asset. Another interesting finding is that if we expand the definition of wealth to include estimates of Social Security and pension wealth, the distribution among U.S. households becomes much more even.2 This finding suggests that, in addition to factors influencing private wealth accumulation, the evolution of institutional arrangements for saving that has taken place over the last two decades may have played an important role in affecting changes in the distribution of wealth over time.

What about the effect of the recent rise in stock and bond market values? The typical view is that the growth in mutual funds and other financial investment avenues has allowed individuals further down in the wealth distribution to take advantage of the strength in equity markets. Certainly, our figures show that households lower in the income distribution are now more likely to own stocks than a decade ago.3 However, between the 1992 and 1995 surveys, the spread of stock ownership and the rise in prices did not lead to a rise in the share of stock and mutual fund assets owned by the bottom 90 percent of the wealth distribution. Although their dollar holdings rose rapidly, the increases were not as large as those for households at the top of the wealth distribution. If patterns of equity ownership have not changed much since 1995, the steep rise in stock prices over the past several years would suggest a further increase in the concentration of net worth. This influence could be offset, to some extent, by a continued broadening in the ownership of equities, particularly through tax-deferred savings accounts. Moreover, some additional offset may have occurred through rising house prices, an important asset of middle class families. Our 1998 survey, which is now in the field, will yield a clearer reading both on how wealth concentration has changed and on the relative importance of different assets in that change.
Despite our best efforts to measure trends in income and wealth, I believe that even those measures - by themselves - cannot yield a complete answer to the question of trends in material or economic well-being. In the United States, we observe a noticeable difference between trends in the dispersion of holdings of claims to goods and services - that is, income and wealth - and trends in the dispersion of actual consumption, the bottom-line determinant of material well-being. Ultimately, we are interested in whether households have the means to meet their needs for goods and for services, including those such as education and medical care, that build and maintain human capital. Using data from the Consumer Expenditure Survey that the U.S. Bureau of Labor Statistics conducts, researchers have found that inequality in consumption, when measured by current outlays, is less than inequality in income.4 These findings are not surprising. As is well known, consumers tend to maintain their levels of consumption in the face of temporary fluctuations in income. Variations in asset holdings and debt typically act as buffers to changes in income. Thus, consumption patterns tend to look more like patterns in income that have been averaged over several years - a finding that should remind us of the pitfalls of reading too much into any year-to-year change in our measures of economic well-being. The BLS’s consumer expenditure data suggest a rise in inequality over the 1980s comparable to that shown by the Census family income figures. However, during the first half of the 1990s, inequality partially receded for consumer expenditures while for income it continued to rise (table 1). The consumption data used in these calculations include only what individuals purchase directly out of their incomes and accumulated savings. Recently, researchers have extended the analysis using a more complete and more theoretically appealing measure of consumption that includes the indirect flow of services from the stock of durable goods that they already own - houses, vehicles, and major appliances.5 As one might expect, although this measure of consumption has a profile somewhat similar to that seen in the current expenditure data over the 1980s and the first half of the 1990s, it shows still lower levels of inequality overall and a clearer pattern of consumption smoothing during the 1981-83 recession.

The information available from the Consumer Expenditure Survey can be used to calculate another interesting measure of the well-being of households: changes in inequality in the ownership of consumer durables. The BLS staff has updated tabulations of these data that they prepared for me several years ago (table 2). Of course, ownership rates for household durables clearly rise with income. But for a number of goods - for example, dishwashers, clothes dryers, microwaves, and motor vehicles - the distribution of ownership rates by income decile has become more equal over time. Even though we may be able to develop an array of measures of current and past trends in inequality - such as those that I have described and potentially others that may be presented at this symposium - we will likely still face considerable uncertainty about how to interpret those measures and about what the future may hold for the trend and the distribution of economic well-being.
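The averaging point lends itself to a quick illustration. The following stylized Python simulation - every parameter is invented, and it uses neither the CEX nor the Census data - gives each household a permanent income plus an independent transitory shock each year; the Gini coefficient of a three-year average of income comes out below the Gini of a single year, which is the sense in which smoothed, consumption-like measures look more equal than annual income:

```python
import random

def gini(values):
    """Standard Gini coefficient, computed from sorted values."""
    xs = sorted(values)
    n = len(xs)
    weighted_sum = sum(i * x for i, x in enumerate(xs, start=1))
    return 2.0 * weighted_sum / (n * sum(xs)) - (n + 1.0) / n

random.seed(0)

# Hypothetical panel: a permanent income level per household, plus a
# mean-zero transitory shock of up to 30 percent in any given year.
permanent = [random.uniform(20_000, 120_000) for _ in range(1_000)]

def one_year(perm):
    return [p * (1 + random.uniform(-0.3, 0.3)) for p in perm]

single_year = one_year(permanent)
averaged = [sum(ys) / 3.0 for ys in zip(one_year(permanent),
                                        one_year(permanent),
                                        one_year(permanent))]

print(round(gini(single_year), 3))  # one year of income: more unequal
print(round(gini(averaged), 3))     # three-year average: less unequal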
Wealth has always been created, virtually by definition, when individuals use their growing knowledge, in conjunction with an expanding capital stock, to produce goods and services of value. The process of wealth creation in the United States has evolved in a number of important ways. Over the last century, we have learned how to be more efficient in meeting the needs of consumers, and thus we have moved from producing essentials to the production of more discretionary goods and services. Moreover, these goods and services have been, over time, increasingly less constrained by the limits of physical bulk. More recently, we have found ways to unbundle the particular characteristics of each good and service to maximize its value to each individual. That striving to expand the options for satisfying the particular needs of individuals has resulted in a shift toward value created through the exploitation of ideas and concepts and away from the more straightforward utilization of physical resources and manual labor. The new thrust has led to structural changes in the way that we organize the production and the distribution of goods and services. It has increased the demand for, and the compensation of, workers who have the ability to create, analyze, and transform information and to interact effectively with others. Most important, it has accorded particularly high value to the application of advanced computer and telecommunications technologies to the generation of economic wealth. At the same time, however, the consequences of technological advances and their implications for the creation of wealth have become increasingly unpredictable. We have found that we cannot forecast with any precision which particular technology or synergies of technologies will add significantly to our knowledge and to our ability to gain from that knowledge. Even if future technological change were to occur at a steady rate, variations in our capacity to absorb and apply advances would likely lead to an uneven rate of increase - over time and across individuals - in returns to expanded investment in knowledge: Supplies of appropriately skilled workers can vary. In some cases, the initial choices in the exploitation of advances may turn out to be sub-optimal. In other cases, the full potential of advances may be realized only after extensive improvements or after complementary innovations in other fields of science. As we consider the causes and consequences of inequality, we should also be mindful that, over time, the relationship of economic growth, increases in standards of living, and the distribution of wealth has evolved differently in various political and institutional settings. Thus, generalizations about the past and the future may be hard to make, particularly in the current dynamic and uncertain environment of economic change. We need to ask, for example, whether we should be concerned with the degree of income inequality if all groups are experiencing relatively rapid gains in their real incomes, though those rates of gain may differ. And, we cannot ignore what is happening to the level of average income while looking at trends in the distribution. In this regard, our goal as central bankers should be clear: We must pursue monetary conditions in which stable prices contribute to maximizing sustainable long-run growth. 
Such disciplined policies will offer the best underpinnings for identifying opportunities to channel growing knowledge, innovation, and capital investment into the creation of wealth that, in turn, will lift living standards as broadly as possible. Moreover, as evidenced by this symposium, sustaining a healthy economy and a stable financial system naturally permits us to take the time to focus efforts on addressing the distributional issues facing our society and on other challenging issues that may remain out in the cold.

Table 1
“GINI COEFFICIENTS” FOR U.S. CONSUMER EXPENDITURES AND INCOME
(Annual values, in chronological order from the early 1980s through the mid-1990s)

Consumption: .290 .285 .302 .305 .312 .319 .327 .328 .323 .325 .328 .320 .329 .320 .318 .317 n.a.
Income:      .365 .369 .380 .382 .383 .389 .392 .393 .395 .401 .396 .397 .404 .429 .426 .421 .425

Source: Consumer expenditure data are from the Consumer Expenditure Survey, U.S. Bureau of Labor Statistics. Income data are for families as of March of the following year from the Current Population Survey, U.S. Census Bureau.

Table 2
“GINI COEFFICIENTS” FOR OWNERSHIP RATES OF SELECTED CONSUMER DURABLES
(By income decile; each good shows two survey values, the earlier followed by the later)

Microwave ovens:   .28  .07
Dishwashers:       .29  .23
Clothes dryers:    .17  .12
Garbage disposals: .26  .21
Motor vehicles:    .09  .07
Freezers:          .06  .07
Clothes washers:   .08  .09
Refrigerators:     .01  .01
Stoves:            .01  .01

Source: Based on tabulations from the Consumer Expenditure Survey, U.S. Bureau of Labor Statistics.

Note: The Gini coefficient is defined as one minus twice the area under the cumulative probability distribution (CPD). The Ginis computed here do not have the properties of a “true” Gini coefficient. For example, a true Gini must lie between zero and one. The Ginis calculated here could be negative if low-income individuals had a higher ownership rate than high-income individuals. The percent ownership rates by decile are transformed into a discrete probability distribution. The formula is pi = ri / Σri, where the sum is over i = 1 to 10, pi is the fraction of all households that own the durable good who are in income decile i, and ri is the actual ownership rate for the ith decile. By construction, the sum of the pi’s is equal to one. For goods that have ownership rates that are relatively equal across deciles, regardless of the level of the ownership rate, the probability distributions are fairly flat with values for pi close to 0.1. For goods that are more concentrated among the affluent households, the probability distributions tend to rise across the income deciles.

* * *

Footnotes

1 Arthur B. Kennickell and R. Louise Woodburn, “Consistent Weight Design for the 1989, 1992 and 1995 SCFs, and the Distribution of Wealth,” manuscript, August 1997.

2 Arthur B. Kennickell and Annika E. Sunden, “Pensions, Social Security, and the Distribution of Wealth,” Finance and Economics Discussion Series, 1997-55, Board of Governors of the Federal Reserve System, November 1997.

3 Martha Starr-McCluer, “Stock Market Wealth and Consumer Spending,” Finance and Economics Discussion Series, 1998-20, Board of Governors of the Federal Reserve System, April 1998.

4 These results were originally reported in Report on the American Workforce, U.S. Department of Labor, 1995 and will appear in David S. Johnson and Stephanie Shipp, “Inequality and the Business Cycle: A Consumption Viewpoint,” forthcoming, Journal of Empirical Economics. David S. Johnson of the U.S. Bureau of Labor Statistics provided the updated data shown in table 1.

5 David S. Johnson and Timothy M. Smeeding, “Measuring Trends in Inequality of Individuals and Families: Income and Consumption,” mimeo., March 1998.
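The note to Table 2 translates directly into a short computation. Here is a minimal Python sketch of that pseudo-Gini - my reconstruction, not BLS code: ownership rates by decile are normalized into shares pi, and the statistic is one minus twice the area under the cumulative distribution of those shares. The trapezoid rule for the area and the sample rates are assumptions the note does not specify.

```python
def pseudo_gini(ownership_rates):
    """One minus twice the area under the cumulative probability
    distribution (CPD) of ownership shares across income deciles,
    as defined in the note to Table 2. Unlike a true Gini, the
    result can be negative."""
    total = sum(ownership_rates)
    shares = [r / total for r in ownership_rates]  # p_i = r_i / sum(r_i)
    cpd, running = [], 0.0
    for p in shares:
        running += p
        cpd.append(running)
    # Area under the CPD over the unit interval, one 0.1-wide step per
    # decile, by the trapezoid rule (an assumption; the note is silent).
    area, prev = 0.0, 0.0
    for value in cpd:
        area += 0.5 * (prev + value) * 0.1
        prev = value
    return 1.0 - 2.0 * area

# Hypothetical ownership rates by income decile, lowest to highest.
rates = [0.10, 0.15, 0.22, 0.30, 0.38, 0.47, 0.55, 0.64, 0.72, 0.80]
print(round(pseudo_gini(rates), 2))  # ~0.31: ownership rises with income
```

A flat profile of ownership rates yields zero, rates rising with income yield a positive value, and rates falling with income yield a negative one - exactly the properties the note describes.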
Board of Governors of the Federal Reserve System, September 1998
Mr. Greenspan considers whether there has been a profound and fundamental alteration in the way the economy works in the United States

Remarks by the Chairman of the Board of Governors of the US Federal Reserve System, Mr. Alan Greenspan, at the Haas Annual Business Faculty Research Dialogue, University of California, Berkeley, California on 4/9/98.

Question: Is There a New Economy?

The question posed for this lecture of whether there is a new economy reaches beyond the obvious: Our economy, of course, is changing every day, and in that sense it is always “new”. The deeper question is whether there has been a profound and fundamental alteration in the way our economy works that creates discontinuity from the past and promises a significantly higher path of growth than we have experienced in recent decades.

The question has arisen because the economic performance of the United States in the past five years has in certain respects been unprecedented. Contrary to conventional wisdom and the detailed historic economic modelling on which it is based, it is most unusual for inflation to be falling this far into a business expansion. Many of the imbalances observed during the few times in the past that a business expansion has lasted more than seven years are largely absent today. To be sure, labor markets are unusually tight, and we should remain concerned that pressures in these markets could spill over to costs and prices. But, to date, they have not. Moreover, it is just not credible that the United States can remain an oasis of prosperity unaffected by a world that is experiencing greatly increased stress. Developments overseas have contributed to holding down prices and aggregate demand in the United States in the face of strong domestic spending. As dislocations abroad mount, feeding back on our financial markets, restraint is likely to intensify.

In the spring and early summer, the Federal Open Market Committee was concerned that a rise in inflation was the primary threat to the continued expansion of the economy. By the time of the Committee’s August meeting, the risks had become balanced, and the Committee will need to consider carefully the potential ramifications of ongoing developments since that meeting.

Some of those who advocate a “new economy” attribute it generally to technological innovations and breakthroughs in globalization that raise productivity and proffer new capacity on demand and that have, accordingly, removed pricing power from the world’s producers on a more lasting basis. There is, clearly, an element of truth in this proposition. In the United States, for example, a technologically driven decline is evident in the average lead times on the purchase of new capital equipment that has kept capacity utilization at moderate levels and virtually eliminated most of the goods shortages and bottlenecks that were prevalent in earlier periods of sustained strong economic growth. But, although there doubtless have been profound changes in the way we organize our capital facilities, engage in just-in-time inventory regimes, and intertwine our newly sophisticated risk-sensitive financial system into this process, there is one important caveat to the notion that we live in a new economy, and that is human psychology. The same enthusiasms and fears that gripped our forebears are, in every way, visible in the generations now actively participating in the American economy. Human actions are always rooted in a forecast of the consequences of those actions.
When the future becomes sufficiently clouded, people eschew actions and disengage from previous commitments. To be sure, the degree of risk aversion differs from person to person, but judging the way prices behave in today’s markets compared with those of a century or more ago, one is hard pressed to find significant differences. The way we evaluate assets, and the way changes in those values affect our economy, do not appear to be coming out of a set of rules that is different from the one that governed the actions of our forebears. Hence, as the first cut at the question “Is there a new economy?” the answer in a more profound sense is no. As in the past, our advanced economy is primarily driven by how human psychology moulds the value system that drives a competitive market economy. And that process is inextricably linked to human nature, which appears essentially immutable and, thus, anchors the future to the past. But having said that, important technological changes have been emerging in recent years that are altering, in ways with few precedents, the manner in which we organize production, trade across countries, and deliver value to consumers. To explore the significance of those changes and their relevance to the possibility of a “new economy”, we need to first detail some key features of our system. The American economy, like all advanced capitalist economies, is continually in the process of what Joseph Schumpeter, a number of decades ago, called “creative destruction”. Capital equipment, production processes, financial and labor market infrastructure, and the whole panoply of private institutions that make up a market economy are always in a state of flux -- in almost all cases evolving into more efficient regimes. The capital stock -- the plant and equipment that facilitates our production of goods and services -- can be viewed, with only a little exaggeration, as continuously being torn down and rebuilt. Our capital stock and the level of skills of our workforce are effectively being upgraded as competition presses business managements to find increasingly innovative and efficient ways to meet the ever-rising demands of consumers for quantity, quality, and variety. Supply and demand have been interacting over the generations in a competitive environment to propel standards of living higher. Indeed, this is the process that, in fits and starts, has characterized our and other market economies since the beginning of the Industrial Revolution. Earlier, standards of living barely changed from one generation to the next. This is the tautological sense in which every evolving market economy, our own included, is always, in some sense, “new”, as we struggle to increase standards of living. In the early part of the 19th century, the United States, as a developing country, borrowed much technology and savings from Europe to get a toehold on the growth ladder. But over the past century, America has moved to the cutting edge of technology. There is no question that events are continually altering the shape and nature of our economic processes, especially the extent to which technological breakthroughs have advanced and perhaps, most recently, even accelerated the pace of conceptualization of our gross domestic product. We have dramatically reduced the size of our radios, for example, by substituting transistors for vacuum tubes. Thin fiber-optic cable has replaced huge tonnages of copper wire. 
New architectural, engineering, and materials technologies have enabled the construction of buildings enclosing the same space but with far less physical material than was required, say, 50 or 100 years ago. Most recently, mobile phones have been markedly downsized as they have been improved. As a consequence, the physical weight of our GDP is growing only very gradually. The exploitation of new concepts accounts for virtually all of the inflation-adjusted growth in output. The cause of this dramatic shift toward product downsizing during the past half century can only be surmised. Perhaps the physical limitations of accumulating goods and moving them in an ever more crowded geographical environment resulted in cost pressures to economize on size and space. Similarly, perhaps it was the prospect of increasing costs of processing ever larger quantities of physical resources that shifted producers toward downsized alternatives. Remember, it was less than three decades ago that the Club of Rome issued its dire warnings about the prospects of running out of the physical resources that allegedly were necessary to support our standards of living. Finally, as we moved the technological frontier forward and pressed for information processing to speed up, for example, the laws of physics required the relevant microchips to become ever more compact.

But what was always true in the past, and will remain so in the future, is that the output of a free market economy and the notion of wealth creation will reflect the value preferences of people. Indeed, the very concept of wealth has no meaning other than as a reflection of human value preferences. There is no intrinsic value in wheat, a machine, or a software program. It is only as these products satisfy human needs currently, or are perceived to be able to do so in the future, that they are valued. And it is such value preferences, as they express themselves in the market’s key signals -- product and asset prices -- that inform producers of what is considered valuable and, together with the state of technology, what could be profitably produced.

To get back to basics, the value of any physical production facility depends on the perceived value of the goods and services that the facility is projected to produce. More formally, the current value of the facility can be viewed as the sum of the discounted value of all future outputs, net of costs. An identical physical facility with the same capacity to produce can have different values in the marketplace at different times, depending on the degree to which the investing public feels confident about the ability of the firm to perceive and respond to the future environments in which the plant will be turning out goods and services. The value of a steel mill, which has an unchanging ability to turn out sheet steel, for example, can vary widely in the marketplace depending on the level of interest rates, the overall rate of inflation, and a number of other factors that have nothing to do with the engineering aspects of the production of steel. What matters is how investors view the markets into which the steel from the mill is expected to be sold over the years ahead. When that degree of confidence in judging the future is high, discounted future values also are high -- and so are the prices of equities, which, of course, are the claims on our productive assets.
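This definition is nothing more than a present-value computation, and the steel mill point can be made concrete in a few lines. A minimal Python sketch - all cash flows and rates invented for illustration - values the identical stream of net outputs under two different degrees of confidence about the future:

```python
def facility_value(net_outputs, discount_rate):
    """Current value of a facility: the sum of the discounted value
    of all future outputs, net of costs."""
    return sum(cash / (1.0 + discount_rate) ** t
               for t, cash in enumerate(net_outputs, start=1))

# The same mill, the same 20 years of net output of 100 per year.
mill = [100.0] * 20
print(round(facility_value(mill, 0.06)))  # confident investors:  ~1147
print(round(facility_value(mill, 0.10)))  # uncertain investors:  ~851
```

The same facility is worth roughly a third more when its future is discounted at 6 percent rather than 10 percent - the sense in which confidence alone, with no change in engineering, moves asset values.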
The forces that shape the degree of confidence are largely endogenous to an economic process that is generally self-correcting as consumers and investors interact with a continually changing market reality. I do not claim that all market behavior is a rational response to changes in the real world. But most of it must be. For, were it otherwise, the relatively stable economic environments that have been evident among the major industrial countries over the generations would not be possible. Certainly, the degree of confidence that future outcomes are perceivable and projectable, and hence valued, depends in large part on the underlying political stability of any country with a market-oriented economy. Unless market participants are assured that their future commitments and contracts are protected by a rule of law, such commitments will not be made; productive efforts will be focused to address only the immediate short-term imperatives of survival; and efforts to build an infrastructure to provide for future needs will be stunted. A society that protects claims to long-lived productive assets thereby surely encourages their development. That spurs levels of production to go beyond the needs of the moment, the needs of immediate consumption, because claims on future production values will be discounted far less than in an environment of political instability and, for example, a weak law of contracts. At that point, the makings of a sophisticated economy based on longer-term commitments are in place. It will be an economy that saves and invests -- that is, commits to the future -- and, hence, one that will grow. But every competitive market economy, even one solidly based on a rule of law, is always in a state of flux, and its perceived productiveness is always subject to degrees of uncertainty that are inevitably associated with endeavors to anticipate future outcomes. Thus, while the general state of confidence and consumers’ and investors’ willingness to commit to long-term investment is buttressed by the perceptions of the stability of the society and economy, history demonstrates that that degree of confidence is subject to wide variations. Most of those variations are the result of the sheer difficulty in making judgments and, therefore, commitments about, and to, the future. On occasion, this very difficulty leads to less-disciplined evaluations, which foster price volatility and, in some cases, what we term market bubbles -- that is, asset values inflated more on the expectation that others will pay higher prices than on a knowledgeable judgment of true value. The behavior of market economies across the globe in recent years, especially in Asia and the United States, has underscored how large a role expectations have come to play in real economic development. Economists use the term “time preference” to identify the broader tradeoff that individuals are willing to make, even without concern for risk, between current consumption and claims to future consumption. Measurable discount factors are intended to capture in addition the various types of uncertainties that inevitably cloud the future. Dramatic changes in the latter underscore how human evaluation, interacting with the more palpable changes in real output, can have profound effects on an economy, as the experiences in Asia have so amply demonstrated during the past year. Vicious cycles have arisen across South-East Asia with virtually no notice. 
At one point, an economy would appear to be struggling, but no more than had been the case many times in the past. The next moment, market prices and the economy appeared in free fall. Our experiences with these vicious cycles in Asia emphasize the key role in a market economy of a critical human attribute: confidence or trust in the functioning of a market system. Implicitly, we engage in a division of labor because we trust that others will produce and be willing to trade the goods and services we do not produce ourselves. We take for granted that contracts will be fulfilled in the normal course of business, relying on the rule of law, especially the law of contracts. But if trust evaporated and every contract had to be adjudicated, the division of labor would collapse. A key characteristic, perhaps the fundamental cause of a vicious cycle, is the loss of trust. We did not foresee such a breakdown in Asia. I suspect that the very nature of the process may make it virtually impossible to anticipate. It is like water pressing against a dam. Everything appears normal until a crack brings a deluge. The immediate cause of the breakdown was an evident pulling back from future commitments, arguably, the result of the emergence among international lenders of widening doubt that the dramatic growth evident among the Asian “tigers” could be sustained. The emergence of excess worldwide capacity in semiconductors, a valued export for the tigers, may have been among the precipitating events. In any case, the initial rise in market uncertainty led to a sharp rise in discounts on future claims to income and, accordingly, falling prices of real estate and equities. The process became self-feeding as disengagement from future commitments led to still greater disruption and uncertainty, rising risk premiums and discount factors, and a sharp fall in production. While the reverse phenomenon, a virtuous cycle, is not fully symmetrical, some part is. Indeed, much of the current American economic expansion is best understood in the context of favorable expectations, interacting with production and finance to expand rather than implode economic processes. The American economic stability of the past five years has helped engender increasing confidence of future stability. This, in turn, has dramatically upgraded the stock market’s valuation of our economy’s existing productive infrastructure, adding about $6 trillion of capital gains to household net worth from early 1995 through the second quarter of this year. While the vast majority of these gains augmented retirement and other savings programs, enough spilled over into consumer spending to significantly lower the proportion of household income that consumers, especially upper income consumers, believed it necessary to save. In addition, the longer the elevated level of stock prices was sustained, the more consumers likely viewed their capital gains as permanent increments to their net worth, and, hence, as spendable. The recent windfall financed not only higher personal consumption expenditures but home purchases as well. It is difficult to explain the recent record level of home sales without reference to earlier stock market gains. The rise in stock prices also meant a fall in the equity cost of capital that doubtless raised the pace of new capital investment. Investment in new facilities had already been given a major boost by the acceleration in technological developments, which evidently increased the potential for profit in recent years. 
The sharp surge in capital outlays during the past five years apparently reflected the availability of higher rates of return on a broad spectrum of potential investments owing to an acceleration in technological advances, especially in computer and telecommunications applications. This is the apparent root of the recent evident quickened pace of productivity advance. While the recent technological advances have patently added new and increasingly flexible capacity, the ability of these technologies to improve the efficiency of productive processes (an issue I will elaborate on shortly) has significantly reduced labor requirements per unit of output. This, no doubt, was one factor contributing to a dramatic increase in corporate downsizing and reported widespread layoffs in the early 1990s. The unemployment rate also began to fall as the pace of new hires to man the new facilities exceeded the pace of layoffs from the old. Parenthetically, the perception of increased churning of our workforce in the 1990s has understandably increased the sense of accelerated job-skill obsolescence among a significant segment of our workforce, especially among those most closely wedded to older technologies. The pressures are reflected in a major increase in on-the-job training and a dramatic expansion of college enrolment, especially at community colleges. As a result, the average age of full-time college students has risen dramatically in recent years as large numbers of experienced workers return to school for skill upgrading. But the sense of increasing skill obsolescence has also led to an apparent willingness on the part of employees to forgo wage and benefit increases for increased job security. Thus, despite the incredible tightness of labor markets, increases in compensation per hour have continued to be relatively modest. Coupled with the quickened pace of productivity growth, wage and benefit moderation has kept growth in unit labor costs subdued in the current expansion. This has both damped inflation and allowed profit margins to reach high levels. That, in turn, apparently was the driving force beginning in early 1995 in security analysts’ significant upward revision of their company-by-company long-term earnings projections. These upward revisions, coupled with falling interest rates, point to two key underlying forces that impelled investors to produce one of history’s most notable bull stock markets. But they are not the only forces. In addition, the sequence of greater capital investment, productivity growth, and falling inflation fostered an ever more benevolent sense of long-term stable growth. People were more confident about the future. The consequence was a dramatic shrinkage in the so-called equity premium over the past two years to near historic lows earlier this summer. The equity premium is the charge for the additional risks that markets require to hold stocks rather than riskless debt instruments. When perceived risks of the future are low, equity premiums are low and stock prices are even more elevated than would be indicated solely from higher expected long-term earnings growth and low riskless rates of interest. Thus, one key to the question of whether there is a new economy is whether current expectations of future stability, which are distinctly more positive than say a decade ago, are justified by actual changes in the economy. For if expectations of greater stability are borne out, risk and equity premiums will remain low. 
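The leverage the equity premium exerts on prices can be sketched with a constant-growth dividend discount model - a textbook device rather than anything in the remarks above, with every number invented for illustration. The required return on equities is the riskless rate plus the equity premium, so a smaller premium means a smaller denominator and a sharply higher price:

```python
def gordon_price(payout, riskless_rate, equity_premium, growth):
    """Constant-growth discounted-cash-flow price: next year's payout
    divided by (required return minus growth), where the required
    return is the riskless rate plus the equity premium."""
    required = riskless_rate + equity_premium
    if required <= growth:
        raise ValueError("required return must exceed growth")
    return payout / (required - growth)

# Identical expected earnings growth; only the equity premium shrinks.
print(gordon_price(1.0, 0.05, 0.04, 0.06))  # premium at 4%: price ~33
print(gordon_price(1.0, 0.05, 0.02, 0.06))  # premium at 2%: price 100
```

Halving the premium here triples the price with no change at all in expected earnings - one way to see how a shrinking equity premium could by itself help produce one of history's most notable bull stock markets.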
In that case, the cost of capital will also remain low, leading, at least for a time, to higher investment and faster economic growth. Two considerations are therefore critical to higher asset values and higher economic growth. The first is whether the apparent upward shift in technological advance will persist. The second is the extent of confidence in the stability of the future that consumers and investors will be able to sustain. With regard to the first: How fast can technology advance, augmenting the pool of investment opportunities that have elevated rates of return, which engender still further increases in expected long-term earnings? Technological breakthroughs, as history so amply demonstrates, are frustratingly difficult to discern much in advance. The particular synergies between new and older technologies are generally too complex to anticipate. An innovation’s full potential may be realized only after extensive improvements or after complementary innovations in other fields of science. According to Charles Townes, a Nobel Prize winner for his work on the laser, the attorneys for Bell Labs initially, in the late 1960s, refused to patent the laser because they believed it had no applications in the field of telecommunications. Only in the 1980s, after extensive improvements in fiber-optics technology, did the laser’s importance for telecommunications become apparent. The future of technology advance may be difficult to predict, but for the period ahead there is the possibility that already proven technologies may not as yet have been fully exploited. Company after company reports that, when confronted with cost increases in a competitive environment that precludes price increases, they are able to offset those costs, seemingly at will, by installing the newer technologies. Such stories seem odd. If cost improvements were available at will earlier, why weren’t the investments made earlier? This implies suboptimal business behavior, contrary to what universities teach in Economics 101. But in the real world, companies rarely fully maximize profits. They concentrate on only those segments of their businesses that appear to offer the largest rewards and are rarely able to operate at the most efficient frontier on all fronts simultaneously. When costs rise, the attention of management presumably becomes focused more sharply on investments to limit the effects of rising costs. But if cost-cutting at will is, in fact, currently available, it suggests that a backlog of unexploited capital projects has been built up in recent years, which, if true, implies the potential for continued gains in productivity close to the elevated rates of the last couple of years. Even if this is indeed the case, and only anecdotal evidence supports it, security analysts’ recent projected per share earnings growth of more than 13 percent annually over the next three to five years is unlikely to materialize. It would imply an ever-increasing share of profit in the national income from a level that is already high by historic standards. Such conditions have led in the past to labor market pressures that thwarted further profit growth. The second consideration with respect to how high asset values can rise is: How far can risk and equity premiums fall? A key factor is that price inflation has receded to quite low levels. 
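As for the earlier point that projected per share earnings growth of more than 13 percent is unlikely to materialize, the arithmetic is simple compounding: if profits grow faster than national income, the profit share must rise without bound. A stylized check, assuming - my figure, not one from the remarks - nominal income growth of 5 percent a year:

```python
earnings_growth = 0.13  # analysts' projected annual per share earnings growth
income_growth = 0.05    # assumed nominal national income growth (illustrative)

share_multiple = 1.0
for year in range(10):
    share_multiple *= (1 + earnings_growth) / (1 + income_growth)
print(round(share_multiple, 2))  # ~2.08: the profit share doubles in a decade
```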
The rising level of confidence in recent years concerning future outcomes has doubtless been related to the fall in the rate of inflation that has, of course, also been a critical factor in the fall in interest rates and, importantly, the fall in equity premiums as well. Presumably, the onset of deflation, should it occur, would increase uncertainty as much as a re-emergence of inflation concerns. Thus, arguably, at near price stability, perceived risk from business-cycle developments would be at its lowest, and one must presume that would be the case for equity premiums as well. In any event, there is a limit on how far investors can rationally favorably discount the future and therefore how low equity premiums can go. Current claims on a source of income available 20 or 30 years in the future still have current value. But should claims on the hereafter?

An implication of high equity market values, relative to income and production, is an increased potential for instability. As I argued earlier, part of capital gains increases consumption and incomes. Since equity values are demonstrably more variable than incomes, when equity market values become large relative to incomes and GDP, their fluctuations can be expected to affect GDP more than when equity market values are low. Clearly, the history of large swings in investor confidence and equity premiums for rational and other reasons counsels caution in the current context. We have relearned in recent weeks that just as a bull stock market feels unending and secure as an economy and stock market move forward, so it can feel when markets contract that recovery is inconceivable. Both, of course, are wrong. But because of the difficulty of imagining a turnabout when such emotions take hold, periods of euphoria or distress tend to feed on themselves. Indeed, if this were not the case, the types of psychologically driven ebbs and flows of economic activity we have observed would be unlikely to exist.

Perhaps, as some argue, history will be less of a guide than it has been in the past. Some of the future is always without historical precedent. New records are always being made. Having said all that, however, my experience of observing the American economy day by day over the past half century suggests that most, perhaps substantially most, of the future can be expected to rest on a continuum from the past. Human nature, as I indicated earlier, appears immutable over the generations and inextricably ties our future to our past. Nonetheless, as I indicated earlier, I would not deny that there doubtless has been in recent years an underlying improvement in the functioning of America’s markets and in the pace of development of cutting edge technologies beyond previous expectations. Most impressive is the marked increase in the effectiveness in the 1990s of our capital stock, that is, our productive facilities, the issue to which I alluded earlier. While gross investment has been high, it has been, in recent years, composed to a significant extent of short-lived assets that depreciate rapidly. Thus, the growth of the net capital stock, despite its recent acceleration, remains well below the peak rates posted during the past half century. Despite the broadening in recent decades of international capital flows, empirical evidence suggests that domestic investment still depends to a critical extent on domestic saving, especially at the margin. Many have argued persuasively, myself included, that we save too little.
The relatively low propensity to save on the part of the American public has put a large premium on the effective use of scarce capital, and on the winnowing out of the potentially least productive and, hence, the least profitable of investment opportunities. That is one of the reasons that our financial system, whose job it is to ensure the productive use of physical capital, has been such a crucial part of our overall economy, especially over the past two decades. It is the signals reflected in financial asset prices, interest rates, and risk spreads that have altered the structure of our output in recent decades towards a different view of what consumers judge as value. This has imparted a significant derived value to a financial system that can do that effectively and, despite recent retrenchments, to the stock market value of those individual institutions that make up that system. Clearly, our high financial returns on investment are a symptom that our physical capital is being allocated to produce products and services that consumers particularly value. A machining facility that turns out an inferior product or a toll road that leads to nowhere will not find favor with the public, will earn subnormal or negative profits, and in most instances will exhibit an inability over the life of the asset to recover the cash plus cost of capital invested in it. Thus, while adequate national saving is a necessary condition for capital investment and rising productivity and standards of living, it is by no means a sufficient condition. The former Soviet Union, for example, had too much investment, and without the discipline of market prices, they grossly misplaced it. The preferences of central planners wasted valuable resources by mandating investment in sectors of the economy where the output wasn’t wanted by consumers -- particularly in heavy manufacturing industries. It is thus no surprise that the Soviet Union’s capital/output ratios were higher than those of contemporaneous free market economies of the West. This phenomenon of overinvestment is observable even among more sophisticated free market economies. In Japan, the saving rate and gross investment have been far higher than ours, but their per capita growth potential appears to be falling relative to ours. It is arguable that their hobbled financial system is, at least in part, a contributor to their economy’s subnormal performance. We should not become complacent, however. To be sure, the sharp increases in the stock market have boosted household net worth. But while capital gains increase the value of existing assets, they do not directly create the resources needed for investment in new physical facilities. Only saving out of income can do that. In summary, whether over the past five to seven years, what has been, without question, one of the best economic performances in our history is a harbinger of a new economy or just a hyped-up version of the old, will be answered only with the inexorable passage of time. And I suspect our grandchildren, and theirs, will be periodically debating whether they are in a new economy.
Board of Governors of the Federal Reserve System, September 1998
Mr. Gramlich reviews the benefits and problems of world capital flows

Remarks by Mr. Edward M. Gramlich, a member of the Board of Governors of the US Federal Reserve System, on “World Capital Flows” at the Carnegie Bosch Institute, University Center at Carnegie Mellon, Pittsburgh, Pennsylvania on 15/9/98.

For many years it has been clear that free trade is generally more efficient economically than protection. There are still huge political fights within countries about trade policies, and there may be occasions when countries are better off deviating from free trade, but free trade normally wins the economic high ground in most policy arguments. But that is not true, or not as true, for international capital flows. Large increases in world debt levels combined with inadequate management of foreign currency risks have led to a collapse of banking systems in country after country, massive changes in exchange rates, and a recession or depression in much of the world economy. In view of these provocations, many observers are now re-examining their preconceptions that free international capital flows are optimal. Particular countries such as Malaysia are clamping on exchange controls, free trade economists such as Paul Krugman are suggesting currency controls as a least bad option, a good many others are recommending distortionary taxes on some capital flows, and almost everybody is at least re-examining their preconceptions.

But before this process goes too far, I would like to enter a contrary plea. There are clear problems with the present system of international lending and borrowing. But there are clear and important benefits as well. We know what the problems are. Let’s fix them as soon as possible, and try to preserve the important benefits of international capital flows.

The Benefits of World Capital Flows

Let me start with the benefits of world capital flows. Much of this is standard neoclassical economics, but it bears repeating in these troubled times. Countries’ saving and investment propensities can differ markedly around the world. If economies were closed to capital flows, domestic interest rates would equilibrate saving and investment in each country. These interest rates would vary widely, which means that capital would have a different marginal value in each country. Suppose now the world were to be opened up to capital flows. Capital would flow out of those countries with low interest rates, where investment prospects are scarce compared to domestic saving, and towards countries where interest rates are high, where investment prospects are abundant relative to domestic saving. Countries that import capital would benefit from a greater stock of high productivity investments. Countries that export capital would put their saving to better use. Both countries would gain, and the world economy would also gain because scarce saving is better allocated to capital markets around the world.

This all may sound a bit academic, but the process has been very important in the world’s economic development. Countries undergoing development can sometimes squeeze the requisite saving out of their poor economies, as the United Kingdom did in the 19th century and Japan did in the 20th century. Or they can import capital to supplement domestic saving, as the United States, Canada, and Australia did in the 19th century and several countries in Eastern Europe and Asia have done in the 20th century.
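The equilibration just described can be put in toy numerical form - purely illustrative, with linear saving and investment schedules and made-up coefficients. Opening the capital market moves both countries from their autarky rates to a common world rate, with capital flowing from the saving-rich, low-rate country to the high-rate one:

```python
def autarky_rate(a, b, c, d):
    """Closed-economy rate equating saving S(r) = a + b*r with
    investment I(r) = c - d*r."""
    return (c - a) / (b + d)

def world_rate(countries):
    """With open capital markets, one rate equates world saving
    with world investment."""
    num = sum(c - a for a, b, c, d in countries)
    den = sum(b + d for a, b, c, d in countries)
    return num / den

# Country A saves much and has few projects; country B the reverse.
A = (8.0, 40.0, 10.0, 40.0)
B = (2.0, 40.0, 12.0, 40.0)

print(autarky_rate(*A), autarky_rate(*B))  # 0.025 vs 0.125: capital is
                                           # abundant in A, scarce in B
r = world_rate([A, B])
outflow_A = (A[0] + A[1] * r) - (A[2] - A[3] * r)  # S_A - I_A at world rate
print(r, outflow_A)  # 0.075 and 4.0: A exports saving, B imports it
```

At the common rate the marginal value of capital is equalized: country A puts its saving to better use abroad, country B finances investment it could not fund at home, and world output is higher than under autarky.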
This imported capital has been very important in the world’s development process, and it has often brought with it other benefits, such as the international spreading of technological improvements. Without world capital flows, the general level of development in the world would be far less.

Problems

If that’s the story, what’s the problem? Why do world capital markets seem to be causing such anguish, in country after country? Why are countries turning away from open capital markets? It turns out that there is some fine print that goes with the neoclassical model, and the world economy has had some trouble with the fine print. Here are some of the ways.

• Exchange rate volatility. Countries trade in different currencies, and exchange rates are necessary to make the conversions. But in addition to being the devices for converting currencies, exchange rates play another important role in the world economy - they also represent the mechanisms for bringing countries’ price structures in line in international trade. Suppose one country’s prices are too high for it to compete in world trade, either because it has suffered an adverse trade shock, has had too much inflation, has too rigid labor contracts, or for some other reason. The country can suffer a painful recession to get its prices back in line. Or it can let its currency depreciate. Depreciations are not absolutely necessary, as is asserted by advocates of fixed exchange rates. But depreciations, or currency fluctuations in general, are often far less costly ways to make international adjustments. The next time you hear someone complain about exchange rate changes, ask them what else they had in mind to restore international equilibrium. While exchange rate changes have this potentially stabilizing effect, they can also be destabilizing. When exchange rates are allowed to fluctuate, they are set in currency markets according to the demands and supplies of forward-looking traders, and they can be very volatile, often overshooting long-term values. This volatility can destabilize trade and impart significant uncertainty to international lending. Borrowers may borrow in international markets expecting to repay at one exchange rate, and when the bill comes due, the exchange rate may differ, by a lot. Lenders and borrowers can both hedge, but hedging is costly, and institutions providing this hedge can go broke.

• Financial institutions. Financial institutions engage in what is known as maturity transformation. They take deposits from you and me, which deposits can be redeemed any time we take out our check book, and make long-term loans to mortgage borrowers and businesses. There are fantastic efficiencies in this process - consumers can get higher returns on their saving, and business can borrow and invest in productive equipment. Indeed, banks and other financial institutions play the same role in allocating domestic saving to its best uses that the international capital market plays for international capital. But as with capital flows, there are risks. If the banks’ loans go bad, the institution is in trouble and may not be able to redeem deposits. Within a country’s borders there are often institutions such as deposit insurance to protect savers, but internationally the risks can be greater. Given that banks are involved in the domestic saving-investment process, it is not surprising that they have become heavily involved in the international saving-investment process.
In general this process too has led to worldwide efficiencies, but here too there are risks. In Japan, for example, the banks have been hurt by a massive decline in real estate and other values, which have put many nonperforming loans on their books. In Korea and Indonesia, the banks were hurt by the foreign currency exposure of their borrowers.

• Incentive effects. Incentive effects are always important in economics, generally complicating the analysis of policy measures. Nowhere is this as true as with international capital flows. One incentive problem involves what is known as moral hazard. Suppose national governments do intervene to protect the safety and soundness of banks. That intervention might seem sensible, but it could inspire more risky lending by the banks, to let the government pick up the tab if things go bad. A second problem involves unilateral default on loan payments or currency obligations. Defaults would appear to benefit debtors, but in the long run these debtors will have trouble getting new credit. Or contagion effects. Even if one borrower is perfectly sound, if other borrowers like it go under, lenders will be reluctant to lend to similarly classified borrowers. Discussions of international capital flows abound with these incentive effects.

• Transparency. A last issue involves transparency. For the international capital market to work well, lenders must know the risks and rewards. They must know the aggregate debt of a country, the exchange rate risks they are taking, and the on and off-balance sheet liabilities of all relevant banks. When they do not have a good picture of these risks, they can get into serious trouble.

While the world financial crisis has hit with uneven force in various countries, and while local situations have differed, the interaction between these four elements has almost always been critical. Sometimes big changes in exchange rates precipitated a crisis, sometimes big changes in asset values put the banks in trouble, sometimes moral hazard issues generated unsound loans, and almost everywhere there was a lack of transparency. Once a crisis got going, unilateral defaults and contagion effects spread it to other countries. The result was a broadscale financial collapse, which then fed over to the real sector and caused bankruptcies, credit restraints, and recessions. These real income losses in turn magnified the financial problems.

Solutions

Are there any solutions to this mess? Many are calling for an end to open capital markets, as if the benefits are not worth the costs. Others are calling for distortionary taxes, to protect countries from too much openness. Should we go part or all of the way back to closed economies? I remain an optimist and I think not. There may be occasional instances where some restrictions are necessary to control extreme capital flows not justified by economic fundamentals. Moreover, the world capital system clearly is in need of repair. But I hope we can repair it and get back to a world where saving in whatever country goes to the country that needs it most, and that can pay the most to the saver. And to a world where international capital flows are an important vehicle for world economic development. Let me discuss some remedies that should help in that process.

• Financial institutions. Two types of corrections are necessary. A first is tantamount to preventive medicine - what should be done when institutions are not in crisis? The basic need here is for better bank supervision.
Supervision must be improved to ensure that proper risk management techniques are put in place and followed. Previous practices where banks, or those to whom banks made loans, loaded up on short-term hard-currency-denominated liabilities must be curbed. Market discipline can also be used to control risk, by exposing financial institutions to as much market discipline as possible and by limiting moral hazard problems. But even with all the preventive medicine in the world, the patient sometimes gets sick, and banks in Japan and a number of other countries surely are. What is to be done then? Examination of a number of historical episodes suggests that there are three important elements to dealing with a banking collapse. First, the bank regulators must go through the balance sheets, determine the problem loans, take them off the banks’ balance sheets, sell them, and have the public take whatever losses are entailed. Second, the troubled institutions must be closed down, with the management replaced and shareholders suffering losses. Third, the healthy institutions must then be recapitalized, preferably by new equity sales in capital markets. The sooner these steps are taken, the better. The longer they are not taken, the longer troubled banks will continue incurring losses, engaging in risky practices, threatening the safe institutions, generating contagion effects, and not performing the banks’ all-important maturity transformation function. Indeed, here is a way foreign capital flows can be part of the solution to the problem, because new foreign-owned banks that can perform this maturity transformation function can be invited to participate in credit-starved economies.

• Maturity transformation. While financial institutions perform maturity transformation, there is no reason to unduly burden the system. Countries have often gotten into trouble when they have done too much maturity transformation - that is, borrowed from abroad at short maturities to finance long-term investment. Steps could be taken to limit such borrowing, either nationally or through bank supervision. Some economists have used this idea to propose taxes on short-term borrowing, though the tax rates necessary to do the job could be very high and it might be more effective to use orthodox principles of bank supervision.

• Transparency. A further set of corrective mechanisms involves greater transparency. It must be clearer to lenders and borrowers alike what risks they are taking. This requires better reporting by financial institutions, and it also requires better aggregate statistics on the uncommitted foreign reserve balances of individual countries. Transparency alone cannot do the job of good bank regulation, but it is hard to imagine a solution to the international lending problems of the day without more transparent accounting.

• The IMF. Last but not least I discuss the IMF. Frankly, my own feeling is that if an institution like the IMF did not exist, we would want to invent something like it. Here is an international institution that, at least in principle, can supervise the finances of different countries without the accusations of big brother that would come about if the United States alone tried to put itself in this role. Moreover, it can organize other lending countries to bring support to countries in need of liquidity assistance, as opposed to having the bill fall entirely on the United States.
Supporting the existence of something like the IMF does not suggest agreement with its exact structure or with all of its policies. The present crisis indicates that international preventive medicine provisions are weak. It may be difficult for the IMF to perform this warning system role, but there is clearly a need for better warnings about the risks of lending to certain countries, to prevent currency vulnerabilities from building up as much as they did. The IMF’s “one size fits all” recommendations of fiscal austerity should also be re-examined, because in many Asian countries expansionary fiscal policies were clearly called for - indeed, still are called for - while the IMF initially recommended the reverse.

But the two biggest criticisms of the IMF involve the moral hazard issue and exchange rate flexibility. On moral hazard, the criticism is more or less inevitable. Any time there is any governmental attempt to increase safety and soundness, moral hazard questions can be raised. In the case of the IMF, if it is there to bail out countries in a liquidity crisis, countries will be more likely to end up in liquidity crises. The answer, it seems to me, is for the IMF to attach tight conditions to its lending, so that countries will certainly not look forward to getting into debtor status. In terms of underlying structural change, this is one of the few leverage points the international lending community has on borrowing countries, and it is important for the IMF to use its tool well.

On exchange rate flexibility, the problem with the critics is that they are not saying the same thing. Some say the IMF is too quick to let the exchange rate fall, some say it is not quick enough. Both cannot be right at the same time. As a general rule, the exchange rate flexibility issue is a hard one and it is hard to make broad recommendations. Sometimes a country’s cost structure is clearly out of line, or its inflation rate is clearly too high, and exchange rate flexibility seems to be the least costly way to bring prices and wages back into line. But exchange rate flexibility will cause capital losses and will raise the cost and risk of future capital transactions. That does not mean that the exchange rate must be preserved at all costs, but it does mean that capital flows themselves put some constraints on the normal international adjustment mechanisms. Given all this, it is hard for me to take a firm position on exchange rate flexibility, and I have some sympathy for the apparent open-mindedness of the IMF on the issue.

Will these solutions be enough? Who knows. But I think we all have a stake in making the world capital system work better. Before we initiate more artificial restrictions on capital flows, we should try to improve the system to take advantage of its very large benefits.
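The currency mismatch running through both the problems and the remedies above - borrowing in a hard currency while earning in a local currency that may depreciate - can be made concrete with a minimal sketch. All figures below are hypothetical, chosen only to illustrate the arithmetic, and are not drawn from any country discussed in the speech.

```python
# A minimal, purely illustrative sketch of the currency mismatch
# described above: a firm borrows short term in dollars while its
# revenues are in a local currency that then depreciates.

usd_loan = 100.0                  # short-term borrowing, in dollars
rate_at_borrowing = 25.0          # local currency units per dollar
rate_at_repayment = 40.0          # after a sharp depreciation

expected_cost = usd_loan * rate_at_borrowing   # 2,500 local units
actual_cost = usd_loan * rate_at_repayment     # 4,000 local units

extra_burden = actual_cost / expected_cost - 1
print(f"Repayment burden rises by {extra_burden:.0%}")   # -> 60%
```

Hedging caps this exposure, but as the speech notes, hedging is costly and the institutions providing it can themselves fail.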
|
board of governors of the federal reserve system
| 1998 | 9 |
Testimony of the Chairman of the Board of Governors of the US Federal Reserve System, Mr. Alan Greenspan, before the Committee on Banking and Financial Services of the US House of Representatives on 16/9/98.
|
Mr. Greenspan’s testimony on the international economic and financial system Testimony of the Chairman of the Board of Governors of the US Federal Reserve System, Mr. Alan Greenspan, before the Committee on Banking and Financial Services of the US House of Representatives on 16/9/98.

As I testified before this Committee in the midst of the Mexican financial crisis in early 1995, major advances in technology have engendered a highly efficient and increasingly sophisticated international financial system. The system has fostered impressive growth in world trade and in standards of living for the vast majority of nations that have chosen to participate in it. But that same efficient financial system, as I also pointed out in that earlier testimony, has the capability to rapidly transmit the consequences of errors of judgment in private investments and public policies to all corners of the world at historically unprecedented speed. Thus, problems that appeared first in Thailand more than a year ago quickly spread to other East Asian economies that are relatively new participants in the international financial system, and subsequently to Russia and to some degree to eastern Europe and Latin America. Even long-time participants in the international financial community, such as Australia, New Zealand, and Canada, have experienced the peripheral gusts of the financial turmoil. Japan, still trying to come to grips with the bursting of its equity and real estate bubbles of the late 1980s, has experienced further setbacks as its major Asian customers have been forced to retrench. Reciprocally, its banking system problems and weakened economy have exacerbated the difficulties of its Asian neighbors.

The relative stability of China and India, countries whose restrictions on international financial flows have insulated them to some extent from the current maelstrom, has led some to conclude that the relatively free flow of capital is detrimental to economic growth and standards of living. Such conclusions, in my judgment, are decidedly mistaken. The most affected emerging East Asian economies, despite the sharp contraction in their economic output during the past year, have retraced, on average, only one-sixth of their per capita growth over the past ten years. Even currently, their average per capita incomes are more than 2½ times the levels of India and China despite the unquestioned gains both have made in recent years as they too have moved partially to join the international financial community. Moreover, outside of Asia, several East European countries have made significant progress towards the adoption and implementation of market systems and have increasingly integrated their financial systems into the broader world context to the evident benefit of their populations. Latin American nations, though currently under pressure, have largely succeeded in opening up their economies to international financial flows, and more rapidly rising living standards have been the result.

It is clear, nonetheless, that participation in the international financial system with all its benefits carries with it an obligation to maintain a level of stability and a set of strong and transparent institutions and rules if an economy is to participate safely and effectively in markets that have become highly sensitive to misallocation of capital and other policy errors.
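The “retraced only one-sixth of their per capita growth” comparison is simple arithmetic on a per capita income path, which a stylized sketch makes explicit. The numbers below are illustrative stand-ins, not the data underlying the testimony.

```python
# Stylized stand-in numbers, not the data behind the testimony:
# suppose per capita income grows from 100 to 250 over ten years,
# then the crisis pushes it back to 225.

start, peak, trough = 100.0, 250.0, 225.0

gain = peak - start                # 150 points of growth
given_back = peak - trough         # 25 points lost in the crisis
retraced = given_back / gain       # one-sixth of the decade's gains

print(f"Share of ten-year gain retraced: {retraced:.2%}")  # 16.67%
```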
When domestic financial systems fail for lack of adequate institutional infrastructures, the solution is not to turn back to a less turbulent, but also less prosperous, past regime of capital controls, but to strengthen the domestic institutions that are the prerequisite for engaging in today’s international financial system. Blocking the exodus or repatriation of capital, as some of the newer participants in the international financial system appear inclined to do after they get into trouble, is, of course, the equivalent of the economy receiving a subsidized injection of funds. If liquidity is tight, the immediate effect of controls can be relief from the strain of meeting obligations and a temporary sense of well-being. This is an illusion, however. The obvious consequence of confiscating part, or all, of foreign investors’ capital and/or income is to ensure a sharp reduction in the availability of new foreign investment in the future. The presumption that controls can be imposed temporarily, while an economy stabilizes, and then removed, gives insufficient recognition to the imbalances in an economy that emerge when controls are introduced. Removing controls subsequently creates its own set of problems, and most governments inclined to impose controls in the first place are therefore loath to remove them. Indeed, controls are often employed to avoid required -- but frequently politically difficult -- economic adjustments. There are many examples in history of controls imposed and removed, but rarely without great difficulty and cost.

To be sure, any economy can operate with its borders closed to foreign investment. But the evidence is persuasive that an economy deprived of the benefits of new technologies, and inhospitable to risk capital, will be mired in a suboptimal standard of living and a slow growth rate associated with out-of-date technologies. It is often stipulated that while controls on direct foreign investment and its associated technology transfer are growth inhibiting, controls on short-term inflows do not adversely affect economic welfare. Arguably, however, the free flow of short-term capital facilitates the servicing of direct investments as well as the financing of trade. Indeed, it is often difficult to determine whether certain capital flows are direct investments or short term in nature. Chile is often cited as an example of the successful use of controls on short-term capital inflows. But in response to the most recent international financial turmoil, Chile has chosen to lower its barriers in order to encourage more inflows.

Those economies at the cutting edge of technology clearly do not need foreign direct investment to sustain living standards and economic growth. The economy of the United States in the 1950s, for example, needed little foreign investment and yet was far more dominant in the world then than it is today. That was a major change from our experiences of the latter half of the nineteenth century, when the vast amount of investment and technology from abroad played a significant role in propelling the US economy to world-class status. Even today, though we lead the world in many of the critical technologies, we still need to borrow a substantial share of the mobile pool of world savings to finance our level of domestic investment. Were we unable to do so, our standard of living would surely suffer. But the inflow of foreign capital would be much reduced if there were uncertainties about whether the capital could be freely repatriated.
While historically there could be considerable risk in American investments -- for example, some nineteenth century investments in American railroads entailed large losses -- the freedom of repatriation and the sanctity of private contracts were, with rare exceptions, secure. Our experiences, and those of others, raise the question of the sustainability of free international capital flows when the conditions fostering and protecting them are impaired or absent. Specifically, an economy whose private and/or public sectors have become heavy net debtors in foreign currency is at risk of default, especially when its exchange rate unexpectedly moves adversely. Clearly, should default become widespread among a number of economies, the flow of international capital to other economies perceived as potentially in similar circumstances will slow and in certain instances reverse. The withdrawal of the ongoing benefits of free flowing capital, as recent history has so amply demonstrated, often can be abrupt and disruptive.

The key question is obviously this: how do private sector entities and governments -- and, by extension, economies as a whole -- allow themselves, through currency mismatches, to reach the edge of insolvency? Indeed, where was the appropriate due diligence on the part of foreign investors? Investors will, on occasion, make misjudgments, and borrowers will, at times, misread their capabilities to service debt. When market prices and interest rates adjust promptly to evidence of such mistakes, the consequences of the mistakes are generally contained and, thus, rarely cumulate to pose significant systemic risk. There was some evidence of that process working in the latter part of the nineteenth century and early twentieth century, when international capital flows were largely uninhibited. Losses, however, in an environment where gold standard rules were tight and liquidity constrained, were quickly reflected in rapid increases in interest rates and the cost of capital generally. This tended to limit the misuse of capital and its consequences. Imbalances were generally aborted before they got out of hand.

But following World War I such tight restraints on economies were seen as too inflexible to meet the economic policy goals of the twentieth century. From the 1930s through the 1960s and beyond, capital controls in many countries, including most industrial countries, inhibited international capital flows and to some extent the associated financial instability -- presumably, however, at the cost of significant shortfalls in economic growth. There were innumerable episodes, of course, where individual economies experienced severe exchange rate crises. Contagion, however, was generally limited by the existence of restrictions on capital movements that were at least marginally effective. In the 1970s and 1980s, recognition of the inefficiencies associated with controls, along with newer technologies and the deregulation they fostered, gradually restored the free flow of international capital prevalent a century earlier. In the late twentieth century, however, fiat currency regimes have replaced the rigid automaticity of the gold standard in its heyday. More elastic currencies and markets, arguably, are now less sensitive to and, hence, slower to contain the misallocation of capital. Market contagion across national borders has consequently been more prevalent and faster in today’s international financial markets than appears to have been the case a century ago under comparable circumstances.
As I pointed out before this Committee almost a year ago, a good part of the capital that flowed into East Asia in recent years (largely in the 1990s) probably reflected the large surge in equity prices in most industrial economies, especially in the United States. The sharp rise induced a major portfolio adjustment out of investments in western industry then perceived as fully priced and into the perceived bargain-priced, but rapidly growing, enterprises and economies of Asia. The tendency to downplay the risks of lending in emerging markets, reinforced by the propensity of governments explicitly or implicitly to guarantee such investments in a number of cases, doubtless led to an excess of lending that would not have been supported in an earlier age. As I also pointed out in previous testimony, standards of due diligence on the part of both lenders and borrowers turned somewhat lax in the face of all the largess generated by abundant capital gains and all the optimism about the prospects for growth in the Asian region. The consequent emergence of heavy losses and near insolvency of a number of borrowing banks and nonfinancial businesses engendered a rush by foreign capital to the exits and induced severe economic contractions with which borrowers and policymakers were unprepared and unable to cope. At that point the damage to confidence and the host economies had already been done.

Endeavors now to block repatriation of foreign funds, while offering temporary cash flow relief, have significant long-term costs and clearly should be avoided, if at all possible. I recognize that if problems are allowed to fester beyond the point of retrieval, no market-oriented solution appears possible. Short-term patchwork solutions to achieve stability are presumed the only feasible alternatives. When that point is reached, an economy is seen as no longer having the capability of interacting normally with the international financial system, and is inclined to withdraw behind a wall of insulation. It must be remembered, however, that the financial disequilibria that caused the initial problems would not have been addressed. Unless they are, those problems will reemerge.

As I implied earlier with respect to the nineteenth century American experience, there are certain conditions precedent to establishing a viable environment for international capital investment, one not subject to periodic systemic crises. Some mechanism must be in place to enhance due diligence on the part of lenders, but especially of borrowers individually and collectively. Lenders’ losses do on occasion pose systemic risks, but it is the failure of borrowers to maintain viable balance sheets and an ability to service their debts that creates the major risks to international stability. The banking systems in many emerging East Asian economies effectively collapsed in the aftermath of inappropriate borrowing and large unhedged exposures in foreign currencies. Much will be required to bolster the fragile market mechanisms of many, but certainly not all, economies that have recently begun to participate in the international financial system. Doubtless at the head of the list is reinforcing the capabilities of banking supervision in emerging market economies. Conditions that should be met before engaging in international borrowing need to be promulgated and better monitored by domestic regulatory authorities. Market pricing and counterparty surveillance can be expected to do most of the job of sustaining safety and soundness.
The experience of recent years in East Asia, however, has clearly been less than reassuring. To be sure, lack of transparency and timely data inhibited the more sophisticated risk evaluation systems from signaling danger. But that lack itself ought to have set off alarms. As one might readily expect, today’s risk evaluation systems are being improved as a consequence of recent failures. Just as importantly, if not more so, unless weak banking systems are deterred from engaging in the type of near reckless major international borrowing that some systems in East Asia engaged in during the first part of the 1990s, the overall system will continue at risk. A better regime of bank supervision among those economies with access to the international financial system needs to be fashioned. In addition, the resolution of defaults and workout procedures require significant improvements in the legal infrastructures in many nations seeking to participate in the international financial system.

None of these critical improvements can be implemented quickly. Transition support by the international financial community to those in difficulty will, doubtless, be required. Such assistance has become especially important since it is evident from the recent unprecedented swings in currency exchange rates for some of the emerging market economies that the international financial system has become increasingly sensitive, more so than in the past, to shortcomings in domestic banks and other financial institutions. The major advances in technologically sophisticated financial products in recent years have imparted a discipline on market participants not seen in nearly a century. Whatever international financial assistance is provided must be carefully shaped not to undermine that discipline. As a consequence, any temporary financial assistance must be carefully tailored to be conditional and not encourage undue moral hazard. It can be hoped that despite the severe trauma that most of the newer participants in the international financial system are currently experiencing, or perhaps because of it, improvements will emerge to the benefit not only of the emerging market economies but of the long-term participants in the system as well.

Parenthetically, a century ago, banks were rarely subsidized and, hence, were required by the market to hold far more capital than they do now. In today’s environment, bank supervision and deposit insurance have displaced the need for high capital-asset ratios in industrial countries. Many of the new participants in the international financial system have had neither elevated capital nor adequate supervision. This shortfall is now generally recognized and being addressed. There are, of course, other reforms that I believe need to be addressed. These were outlined in my earlier testimonies before this Committee.
|
board of governors of the federal reserve system
| 1998 | 9 |
Testimony of Mr. Edward W. Kelley, Jr., a member of the Board of Governors of the US Federal Reserve System, on the Year 2000 progress of the banking and financial services sector, before the Committee on Banking and Financial Services of the US House of Representatives on 17/9/98.
|
Mr. Kelley reports on the Year 2000 progress of the banking and financial services sector in the United States Testimony of Mr. Edward W. Kelley, Jr., a member of the Board of Governors of the US Federal Reserve System, on the Year 2000 progress of the banking and financial services sector, before the Committee on Banking and Financial Services of the US House of Representatives on 17/9/98.

Thank you for again inviting me to appear before this Committee to discuss the Year 2000 issue. This problem poses a major challenge to public policy: the stakes are enormous, nothing less than the preservation of a safe and sound financial system that can continue to operate in an orderly manner when the clock rolls over at midnight on New Year’s Eve and the millennium arrives. The Year 2000 problem will touch much more than just our financial systems and could temporarily have adverse effects on the performance of the overall US economy as well as the economies of many, or all, nations if not corrected. As I said last April in testimony before the Senate Committee on Commerce, Science, and Transportation, some of the more adverse scenarios are not without a certain plausibility, if this challenge were being ignored. But it is not being ignored. While it is impossible today to precisely forecast the impact of this event, and the range of possibilities runs from minimal to extremely serious, an enormous amount of work is being done in anticipation of the rollover of the millennium, and I am optimistic that this work will pay off. In that spirit, let me update you on what the Federal Reserve has done to address the Year 2000 issue. Since I last testified here in November 1997, the Federal Reserve has met the goals that we set for ourselves. We have:

• renovated our mission-critical applications and nearly completed our internal testing;
• opened our mission-critical systems to customers for testing;
• progressed significantly in our contingency planning efforts;
• implemented a policy concerning changes to our information systems; and
• concluded our initial review of all banks subject to our supervisory authority.

While these accomplishments are indicative of our significant progress in addressing the Year 2000 issue, much work remains. In the testing area, we are finalizing plans for concurrent testing of multiple mission-critical applications by customers. We are coordinating with the Clearing House Interbank Payments System (CHIPS) and the Society for Worldwide Interbank Financial Telecommunication (SWIFT) to provide a common test day for customers of Fedwire and these two systems. We have a System-wide project underway to enhance our contingency plans to address failures external to the Federal Reserve. We are conducting a second round of supervisory reviews of banks subject to our supervisory authority and also actively coordinating with various domestic and international Year 2000 organizations. This morning I would like to discuss these achievements and the important aspects of the job that remain ahead of us.

Federal Reserve Readiness

The Federal Reserve has completed the renovation of its mission-critical systems, and we are nearing the conclusion of our internal Year 2000 testing efforts. Internal testing includes both individual applications and application interfaces, such as the exchange of data between Fedwire and our Integrated Accounting System.
Testing is conducted through a combination of two elements: one is future-dating our computer systems to verify the readiness of our information technology, and the other is testing critical future date processing within our business applications. Communications network components are also being tested and certified in special test lab environments at the Federal Reserve Automation Services and the Board of Governors. The Reserve Banks and the Board have implemented test century date change (CDC) local area networks to verify the readiness of vendor-provided products and internal applications that operate in network-based computing environments. With the exception of a few systems that will be replaced by March 1999, we will complete the testing activities and implement our mission-critical applications in production by year-end 1998.

On June 29, 1998, we made our future-dated test environment available to customers for Year 2000 testing; the crucial testing period will extend through 1999. Depository institutions that are Federal Reserve customers, and thus rely on our payment applications such as Fedwire, Fed ACH, and check processing systems, can test century rollover and leap year transactions six days a week. On six weekends this fall, depository institutions will be able to test Year 2000 dates with several applications simultaneously. We are providing assistance to our customers who test with us, and have provided them, through a series of Century Date Change bulletins, with technical information and guidelines concerning the testing activity. By the end of August, almost 400 customers, including the US Treasury, had conducted CDC testing with the Federal Reserve, and the number scheduling tests is increasing rapidly. These tests encompass ten of our mission-critical applications. To ensure that the nation’s larger banks are taking advantage of this testing opportunity, we intend to contact any that have not availed themselves of this service. Several foreign banking organizations have begun to test large dollar payment systems with the Federal Reserve and CHIPS. Most large foreign banks will participate in the September 26, 1998, coordinated test in which Fedwire, CHIPS, and SWIFT are participating.

Like most information technology environments, ours are composed primarily of vendor hardware and software products. To assess the Year 2000 readiness of these environments, as well as our building systems such as vault and climate control systems, we have created an automated inventory of the vendor products that we use and are tracking the Year 2000 compliance status of those products. Although the Federal Reserve has made progress in independently testing vendor products, we will continue these efforts. In prior testimony, I have noted the critical dependence of banks on telecommunication services and the need to ensure the readiness of telecommunication service providers. To foster a better understanding of the importance of information sharing by the telecommunications industry, I wrote to Federal Communications Commissioner Powell about this issue. Commissioner Powell has been responsive and has provided us and others with information regarding the FCC’s oversight and plans. The Federal Reserve is also participating in the Telecommunications Sector Group of the President’s Council on Year 2000 Conversion. In addition, the Federal Reserve is monitoring telecommunication carriers to assess whether those used by the Federal Reserve will be Year 2000 compliant.
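The defect behind all of this testing is straightforward to state: systems that store years in two digits read the “00” that follows “99” as 1900, and can also mishandle the century’s leap-year rule, since 2000 is a leap year while 1900 was not. A minimal sketch of the bug and of one common remediation, “windowing”, follows; the pivot year of 50 is an arbitrary illustrative choice, not a Federal Reserve parameter.

```python
from datetime import date

def naive_year(yy: int) -> int:
    """Legacy-style logic: a two-digit year is blindly prefixed
    with '19', so the '00' that follows '99' becomes 1900."""
    return 1900 + yy

def windowed_year(yy: int, pivot: int = 50) -> int:
    """A common remediation ('windowing'): two-digit years below
    the pivot map to 20xx, the rest to 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

# Century rollover: the naive logic misreads '00'.
assert naive_year(0) == 1900        # the Year 2000 defect
assert windowed_year(0) == 2000     # remediated reading
assert windowed_year(99) == 1999    # older dates still work

# Leap-year processing: 2000 is a leap year (divisible by 400),
# but 1900 was not, so the naive reading also breaks 29 Feb 2000.
def is_leap(y: int) -> bool:
    return y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)

assert is_leap(2000) and not is_leap(1900)
date(2000, 2, 29)   # a valid date; date(1900, 2, 29) would raise
```

Future-dating a test environment, as described above, simply forces paths like these to execute with post-1999 and 29 February 2000 dates before the real rollover.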
We are pleased with the responsiveness of the telecommunications industry: plans for industry testing are well under way, with participation from the major service providers, as well as their suppliers.

Contingency Planning

As the nation’s central bank, the Federal Reserve is actively engaged in contingency planning for both operational disruptions and systemic risks. An internal CDC Council consisting of senior managers is coordinating contingency planning across the Federal Reserve’s various functions and is fostering a cohesive approach to internal readiness and interaction with the financial community. In general, the banking industry’s focus has also progressed from the renovation of systems to business continuity and contingency planning. Contingency planning for the Federal Reserve includes payments systems, currency availability and distribution, information processing, the discount window, and the supervision function. We will also play an important role in coordinating with the financial community concerning issues such as systemic risk and cross-border payments.

Operational Contingency

The Federal Reserve’s plans for operational continuity build on existing contingency plans. As you know, we have long maintained comprehensive contingency plans that are routinely tested and have been implemented during natural disasters and other disruptions. These plans cover our internal systems, as well as the services we provide to depository institutions. In June 1998, each of the Federal Reserve’s business offices completed assessments of the adequacy of existing contingency scenarios to address CDC risks. Federal Reserve CDC contingency workgroups are identifying problems external to the Federal Reserve that may arise when the date changes to 2000, such as those affecting telecommunications providers, large financial institutions, utility companies, and other key financial market participants, as well as difficulties abroad that affect US markets or institutions. The workgroups are developing corresponding recommendations to mitigate those problems. The Federal Reserve plans to finalize contingency plans reflecting these recommendations by November 30, 1998. We will continue to refine our CDC contingency plans as necessary throughout 1999. In the fourth quarter of 1998, we will focus our efforts on how to test our contingency plans to ensure their operational effectiveness at the century rollover.

Change Management

As part of our operational readiness planning, the Federal Reserve is developing procedures to manage the risks posed by changes to information systems in 1999 and the first quarter of 2000. After the scheduled completion of testing and implementation of our critical applications, changes to Federal Reserve policies, rules, regulations, and services that generate changes to critical information systems create the risk that those systems may no longer be CDC compliant. Consequently, we have established guidelines to significantly limit policy and operational changes, as well as internal hardware and software changes, during late 1999 and early 2000 in order to minimize the risks and complexities of problem resolution associated with the introduction of new processing components. By limiting changes to our systems, we will not only provide a stable internal processing environment entering the Year 2000, but we will also minimize changes that our customers could be required to make to their applications that interface with our software.
In addition, we intend to aggressively coordinate with other institutions that typically generate policy and operational changes in the financial industry. We intend to publish our guidelines to assist other organizations facing similar issues, and I would urge that Congress, as well as other federal agencies, consider adoption of such change management policies as we move into 1999.

Currency

As I noted earlier, cash availability and processing is an issue we have considered in the contingency planning process. We have regularly met the public’s heightened demand for US currency in peak seasons or in extraordinary situations, such as natural disasters. We recently submitted our fiscal year 1999 currency printing order to the Department of the Treasury’s Bureau of Engraving and Printing and we increased the size of next year’s print order due to Year 2000 considerations. With this order, we will substantially increase the amount of currency either in circulation or in Federal Reserve vaults over current levels by late 1999. We believe this increase in the level of currency should be ample to meet the public’s demand for extra cash during the period surrounding the century rollover. This is a precautionary step on our part -- we believe it is prudent to print more currency than we think will be required rather than to risk not printing enough. While we do not anticipate any extraordinary demand for cash, we believe it is important that the public have complete confidence that sufficient supplies of currency will be available. In effect, the Federal Reserve is accelerating the timing of currency printing; we are planning for a possible short-lived increased demand for cash and will be able to reduce future print orders to lower-than-normal levels. As we monitor the public’s demand for currency, we can introduce other measures to further increase cash levels. First, the recent currency order with the Bureau of Engraving and Printing is for the federal fiscal year 1999, so that there will be time to print additional notes in the last three months of 1999. Second, we can change the print order to increase production of higher denomination notes. Third, we can increase staff in Reserve Bank cash operation functions to improve the turnaround time required to process cash deposits and move currency back into circulation. Finally, as a last resort, we can hold off the destruction of old or worn currency.

Liquidity

Another contingency planning issue for the Federal Reserve is liquidity. Despite their best efforts, some depository institutions could encounter problems maintaining reliable computer systems at the rollover, and these problems may or may not affect their funding positions. To the extent necessary, the Federal Reserve is prepared to lend, in appropriate circumstances and with adequate collateral, to depository institutions when market sources of funding are not reasonably available. The terms and conditions of such lending may depend upon the circumstances causing the liquidity shortfall.

Financial Sector Initiatives

The Federal Reserve and private industry have intensified cooperative efforts to address contingency planning.
The Year 2000 Contingency Planning Working Group of the New York Clearing House (NYCH), the Securities Industry Association (SIA), and the Federal Reserve is developing coordinated contingency plans for the Year 2000 and will act as liaison with other industry groups addressing contingency planning on behalf of banks, securities firms, exchanges, clearance and settlement organizations, regulators, and international markets. Among other things, the Working Group is considering plans for the establishment of Year 2000 communications centers throughout the country, and perhaps internationally. Primarily, such centers would facilitate the exchange of up-to-date information on developing problems and issues among participants and enhance the development of consensus, when necessary, to coordinate timely responses to problems. The Federal Reserve is assisting in the government’s coordination of the Year 2000 effort within the financial industry by participating in the Financial Institutions Sector Group of the President’s Council on Year 2000 Conversion. A senior Board official who chairs this Sector Group has been working with representatives of government financial organizations, including the federal banking agencies, the Department of the Treasury, the Securities and Exchange Commission, and other agencies responsible for various financial intermediaries, to assess the Year 2000 readiness of the financial industry and formulate strategies for addressing interagency Year 2000 issues.

Bank Supervision

I would now like to turn to our industry oversight activities. The Federal Reserve met its goal, set in May 1997, of conducting a Year 2000 supervisory review of all banks subject to our supervisory authority by June 1998. This public commitment and visible effort did much to stimulate industry action on the Year 2000 issue. We have also established ties and are providing significant support to numerous public and private groups, both domestic and international, that are addressing the Year 2000 readiness of their respective constituencies. As part of our outreach program, we continually emphasize the critical significance of ensuring that computer systems and applications are Year 2000 compliant and the complexity of the managerial and technological challenges that the required effort presents for all enterprises. For entities such as financial institutions that rely heavily on computers to provide financial services to customers, achieving Year 2000 compliance in mission-critical systems is essential for maintaining the quality and continuity of customer services. While bank supervisors can provide guidance, encouragement, and strong formal and informal supervisory incentives to the banking industry to address this challenge, it is important to recognize that we cannot be ultimately responsible for ensuring or guaranteeing the Year 2000 readiness and viability of the banking organizations we supervise. Rather, the boards of directors and senior management of banks and other financial institutions must be responsible for ensuring that the institutions they manage are able to provide high quality and continuous services from the first day in January of the Year 2000. As we have emphasized continually during the past sixteen months, this critical obligation must be among the very highest of priorities for bank management and boards of directors.
Policy Guidance and Supervisory Reviews

The Federal Reserve continues to work closely with the other banking agencies that comprise the Federal Financial Institutions Examination Council (FFIEC) to address the banking industry’s Year 2000 readiness. A series of seven advisory statements has been issued since I was last here in November 1997, including statements on the nature of Year 2000 business risk, the importance of service provider readiness, the means to address customer risk, the need to respond to customer inquiries, the critical importance of testing, the urgency of effective contingency planning, and the need to address the readiness of fiduciary activities. As a result of these advisory statements, the extent of the industry’s Year 2000 efforts has significantly intensified. These statements can be found in their entirety at http://www.federalreserve.gov/Y2K. Compliance with these statements is assessed during the conduct of supervisory reviews.

Through June 30, the Federal Reserve had conducted reviews of approximately 1,600 organizations. Information and data collected during these reviews have proven to be reliable, consistent with our overall supervisory experience, which is heavily dependent on an extensive on-site examination program. These reviews have resulted in a significant focus of attention on the subject matter within the industry and identified several issues warranting additional attention by the supervisors, particularly the need for supplemental guidance on testing and contingency planning. It is critical that banks avail themselves of every opportunity to test mission-critical systems internally and with their counterparties, and recurring testing may be warranted as additional systems are renovated to assure that those systems already tested are not adversely affected.

Based on the reviews completed by the Federal Reserve, the vast majority of banking organizations are making satisfactory progress in their Year 2000 planning and readiness efforts. About four percent are rated “needs improvement” and fewer than one percent are rated “unsatisfactory”. In these cases, the Federal Reserve has initiated intensive supervisory follow-up. Working closely with state banking departments, the Federal Reserve is making a concerted effort to focus additional attention on those particular banking organizations that are deemed deficient in their Year 2000 planning and progress. Deficient organizations have been informed of their status through supervisory review comments, meetings with senior management or the board of directors, and deficiency notification letters calling for submission of detailed plans and formal responses to the deficiencies noted. Such organizations are then subject to increased monitoring and supervisory follow-up, including more frequent reviews. Restrictions on expansionary activities by Year 2000 deficient organizations have also been put into place. As a result of these letters, organizations once deemed deficient have taken significant steps to enhance their Year 2000 programs and over half have been upgraded to satisfactory. The Federal Reserve has commenced Phase II of its supervision program, which covers the nine months from July 1998 through March 1999. During this second phase, we will conduct another round of supervisory reviews focused on Year 2000 testing and contingency planning. In addition, we have committed to conducting another review of the information systems service providers and will distribute the results to the serviced banks.
Assessment of Review Results

Based on these reviews and other interactions with the industry, it appears that financial institution progress in renovating mission-critical systems has advanced notably since the Federal Reserve and the other banking agencies escalated efforts to focus the industry’s attention on ensuring Year 2000 readiness. Banking organizations are making substantial headway toward Year 2000 readiness and, with some exceptions, are on track to meet FFIEC guidelines. Specifically, the FFIEC guidelines call for the completion of internal testing of mission-critical systems by year-end 1998. Most large organizations are nearing completion of the renovation of their mission-critical systems and are vigorously testing those that have been renovated. Smaller organizations are working closely with their service providers in an effort to confirm that the efforts under way will assure the readiness and reliability of the services and products on which they depend.

Information Systems Service Providers

The banking agencies are examining the Year 2000 readiness of certain information systems service providers and software vendors that provide services and products to a large number of banking organizations. These examinations are often conducted on an interagency basis. The Federal Reserve has participated in reviews of sixteen national service providers and twelve national software vendors. In addition, the banking agencies are examining selected regional service providers and software vendors. These examinations assess the overall Year 2000 program management of the firms and confirm their plans to support products for Year 2000 readiness. To help banking organizations assess the Year 2000 readiness and dependability of their service providers and to encourage examined service providers to cooperate fully with the industry’s efforts, the banking agencies are distributing the results of Year 2000 reviews of service providers and software vendors to the serviced banks. The information in the reports is not a certification of the readiness of the service provider, and it is still incumbent on each bank to work closely with its service providers to prepare for the century date change. Service providers and software vendors serving the banking industry are keenly aware of the industry’s reliance on their products and services, and most consider their Year 2000 readiness to be their highest priority in order for them to remain competitive in an aggressive industry.

Credit Quality

FFIEC guidelines call for banks to have a plan to assess customer Year 2000 readiness by June 30, 1998, and to complete an assessment of their customers’ Year 2000 readiness by September 30, 1998, in order to better understand the risks faced by the bank if customers are unable to meet their obligations on a timely basis. Even though our Phase I Year 2000 examinations were conducted before the June 30, 1998, milestone, our examiners noted that most organizations either had begun planning or had initiated their customer assessment programs. We have seen no signs that credit quality has deteriorated as a result of Year 2000 readiness considerations, although it is still early. Results from a Federal Reserve Senior Loan Officer Opinion Survey on Bank Lending Practices (May 1998) indicated that respondents generally include Year 2000 preparedness in their underwriting and loan review standards.
Another survey of Senior Loan Officers is to be conducted in November in order to obtain a more timely picture of any deterioration in credit quality related to the Year 2000. Efforts have also been made to prompt the nation’s largest banks that syndicate large loans to address the Year 2000 readiness of their borrowers. Through the Shared National Credit Program, banks that syndicate credits over $20 million are asked to provide the banking agencies with information pertaining to the banks’ efforts to assess the readiness of the borrowers. This initiative has helped large lenders understand that they need to consider their customers’ readiness in their risk management programs.

Additional Outreach Initiatives

The Federal Reserve is participating in numerous outreach initiatives with the banking industry, trade associations, regulatory authorities, and other groups that are hosting conferences, seminars, and training opportunities focusing on the Year 2000 and helping participants understand better the issues that need to be addressed. Partly in response to the requirements of the Examination Parity and Year 2000 Readiness for Financial Institutions Act, which calls for the banking agencies to conduct seminars for bankers on the Year 2000, the federal banking agencies have been working with state banking departments as well as national and local bankers’ associations to develop coordinated and comprehensive efforts at improving the local and regional programs intended to focus attention on the Year 2000. In the first six months of 1998, the Federal Reserve participated in over 230 outreach initiatives reaching over 14,000 bankers. Another 100 outreach initiatives are scheduled for the third quarter of 1998. In addition, our public web site provides extensive information on our Year 2000 supervision program and on other resources available to the industry to help prepare for the millennium.

International Coordination

International cooperation on Year 2000 has intensified over the past several months because of the efforts of the public sector Joint Year 2000 Council (Joint Council) and the private sector Global 2000 Coordinating Group (G-2000). As you recall from your hearings on international issues in June, Mr. Ernest Patrikis, then Chairman of the Joint Council and a senior official of the Federal Reserve Bank of New York, testified on the global efforts of the Joint Council to enhance international initiatives by financial regulators. His successor as Chairman of the Joint Council, Federal Reserve Governor Roger Ferguson, is continuing those efforts and is working closely with an external consultative committee composed of representatives from international financial services providers, international financial market associations, financial rating agencies, and a number of other international industry associations. The Joint Council is working to foster better awareness and understanding of Year 2000 issues on the part of regulators around the world; for example, the Joint Council is sponsoring a series of regional seminars for banking, insurance, and securities supervisors. The G-2000 includes more than forty financial institutions from over twenty countries that are addressing country assessments, testing, and contingency planning, as well as other issues.
The group has developed a standard framework for assessing individual country preparations for the century date change and also will address the Year 2000 readiness of financial institutions, service providers, and the countries’ infrastructures. This framework is being used to collect information and assess the readiness of about twenty major countries by the end of this year, with others scheduled for in-depth reviews in 1999. The Basle Committee on Banking Supervision continues to be active on Year 2000 issues, both within the Joint Council and separately, as part of its normal supervisory activities. Year 2000 will be a major issue to be discussed at the International Conference of Bank Supervisors in October. The Basle Committee is also planning a follow-up survey on Year 2000 progress.

Closing Remarks

Financial institutions have made significant progress in renovating their systems to prepare for the Year 2000 and much has been accomplished to ensure the continuation of reliable services to the banking public at the century rollover. We are committed to a rigorous program of industry testing and contingency planning and, through our supervisory initiatives, to identifying those organizations that most need to apply additional attention to Year 2000 readiness. The Federal Reserve has renovated its mission-critical applications, and we are nearing the completion of our internal testing activities. To manage the risks posed by subsequent changes to these systems, the Federal Reserve has instituted guidelines to significantly limit policy, operational, hardware, and software changes during late 1999 and early 2000. Going forward, we will continue our industry and international coordination efforts, including participation in the President’s Council on Year 2000 Conversion, the Joint Year 2000 Council, and trade associations, to assist the industry in preparing for the Year 2000. In closing, I would like to thank the Committee for its extensive efforts to focus the industry’s attention on this significant matter. Awareness of the extent and importance of this challenge is a critical first step in meeting it, and the Committee’s participation has been most helpful.
|
board of governors of the federal reserve system
| 1998 | 9 |
Remarks by Mr. Laurence H. Meyer, a member of the Board of Governors of the US Federal Reserve System, at the Financial Institutions Center, University of Tennessee, Knoxville, on 18/9/98.
|
Mr. Meyer reflects on recent developments in banking and financial markets: implications for bank supervision and regulation Remarks by Mr. Laurence H. Meyer, a member of the Board of Governors of the US Federal Reserve System, at the Financial Institutions Center, University of Tennessee, Knoxville, on 18/9/98.

The banking and financial services industries have been undergoing rapid change in recent years. These changes include the consolidation and increased geographic scope of the banking industry, a blurring of the distinctions between various financial institutions, deregulation, and the implementation of financial strategies made possible by improvements in computer technology and advances in finance theory. These changes in turn have created new opportunities and risks for financial institutions and new challenges for supervisors and regulators. My focus today will be on how legislation, supervisory practice and regulatory standards have to adapt to these changes in order to promote competition for financial services and maintain stable financial systems. Markets are moving very rapidly, while legislative and regulatory changes are occurring much more slowly. It is important that both legislation and regulatory standards quickly catch up to the changes in the marketplace in order to maximize the benefits to consumers and minimize the risks to financial stability.

There are three ways in which recent developments have put pressure on legislation, regulatory standards and supervisory practice. First, the job of supervision has simply become more challenging as a result of the increased size, geographic scope, complexity, globalization and diversity of banking organizations. Second, the effectiveness of one of the foundations of the current regulatory framework, the risk-based capital rules, has eroded as improvements in credit risk measurement facilitated increased use of securitization and credit derivatives to arbitrage those capital rules. These innovations have allowed banks to reduce the capital charges for a given set of risks, making reported risk-based capital ratios increasingly less meaningful and the current standards progressively less effective. Third, banks and other financial institutions, often with the help of their regulators, have found ways, albeit within statutory limits, not only to expand the activities in which they engage, but also to compete more effectively with one another. Market developments and regulatory actions have created tensions with the existing legislative framework underpinning banking and financial services.

I will focus on three policy responses. First, I will discuss the potential for increased market discipline to raise the overall effectiveness of the current regulatory and supervisory framework, responding to the challenge of supervising such large and complex institutions, without more burdensome and intrusive regulations. Second, I will emphasize the priority that needs to be attached to updating and refining our capital standards. Third, I will discuss the importance of financial modernization legislation. Before tackling the policy questions, let me begin by reviewing why and how banks are regulated.

Why Do We Regulate Banks?

Why do we supervise and regulate banks? The reasons are very straightforward. Initially, when banks were the dominant financial institutions, the major purpose was to reduce “systemic risk” and the impact on the economy of bank failure.
Then as the safety net was created to reduce systemic risk, the resultant moral hazard created an even more important reason to supervise and regulate banks. Historically, the objective was to protect against a panic-driven flight to currency, or bank runs, that caused a catastrophic decrease in the money supply and the collapse of financial intermediation. Today, largely because of deposit insurance and the Federal Reserve discount window, flights to currency are not a real concern in the United States. But liquidity and solvency problems at large banks and other financial institutions can create systemic concerns, and while banks are no longer “special” the large banks do play a central role in our financial system. Indeed, the stability of the electronic, large dollar payments system, which moves trillions of dollars a day and in which banks play a pivotal role, is critical in limiting systemic risk. Other potential pressure points, in all of which banks play a key role, include the liquidity of securities, financial derivatives, and interbank funding markets. Our very success at virtually eliminating the risk of bank runs in the United States has led to a second major reason for supervising and regulating banks. Deposit insurance, the discount window, and Federal Reserve payment system guarantees -- the very things that have eliminated bank runs -- create what is called a “safety net” for banks. The existence of this safety net gives the Government a direct stake in keeping bank risks under control, just as a private insurance company has a stake in controlling the risks of policyholders. Because deposit insurance and other parts of the safety net can never be fully and accurately priced, it is necessary for us to monitor and sometimes to act to control bank risks in order to protect the potential call on taxpayer funds. An equally important, if unintended, consequence of the safety net is that it creates what economists term “moral hazard” incentives for some banks to take excessive risks. That is, the safety net creates incentives for banks to take larger risks than otherwise, because the safety net, and potentially taxpayers, may absorb most of the losses if the gamble fails. Such incentives are especially strong if the bank is near failure since, at this point, bank stockholders have virtually nothing to lose. How Do We Regulate Banks? Bank regulation and supervision can be thought of as a portfolio of regulatory rules and supervisory practices that collectively aim to monitor and protect the safety and soundness of the banking system, so as to control systemic risk, offset moral hazard incentives, and limit the potential call on the taxpayer. This portfolio includes, in no particular order, (1) the set of rules and practices relating to the federal safety net itself; (2) the requirements for disclosure and the incentives that promote market discipline of banking organizations; (3) the rules that govern the capital that banking organizations must hold; (4) the limitations on activities that banking organizations may engage in; (5) the restrictions on where certain activities can be conducted within banking organizations; (6) prudential restrictions on the relationship between banks and their affiliates; and (7) the nature and details of the supervision of banking organizations. The more effective is market discipline and the more effective are capital standards, the less there is a need to invoke the more intrusive and burdensome components of the regulatory portfolio. 
For this reason, I focus particularly on enhanced market discipline and reform of capital standards. But financial modernization legislation has also focused attention on the role of activity limitations, restrictions on where activities are conducted within the banking organization, and the appropriate supervisory framework for more diversified financial service organizations that would emerge from such legislation. Market Discipline: The Role of Subordinated Debt To a significant extent, market discipline and regulatory standards are substitutes. The more effective the former, the less reliance need be placed on the latter. Because the regulatory approach involves significant costs to both the private and public sectors, increased reliance on the market can reduce the cost of achieving a given degree of safety and soundness in the banking system and thereby increase the efficiency of the overall regulatory framework. But there is a significant trade-off to worry about. A greater reliance on the market may also increase systemic risk. For example, eliminating deposit insurance would certainly increase the incentive for depositors to monitor the risk profiles of banks. The price for such increased market discipline, however, would, in my view, be unacceptably high -- a significantly increased risk of bank runs and hence systemic risk. The key therefore is to find opportunities to increase market discipline in a way that results in an acceptable trade-off with respect to systemic risk and thereby to achieve the optimal balance of market discipline and regulatory standards. Market discipline involves information and incentives. Markets need access to information to judge the risk profiles of banking organizations. But, to make market discipline work, there also have to be incentives for banks to respond to the changing judgment of the market. It may be possible to increase market discipline by requiring large, internationally active banks to issue a minimum amount of certain types of subordinated debt to the public. An appealing aspect of this approach is that subordinated debt holders, so long as they are not bank “insiders”, face only downside risk, and thus their risk preferences are very close to those of the FDIC. Subordinated debt holders would therefore be expected to impose market discipline on the bank that is quite consistent with what bank supervisors are trying to do, including encouraging banks to disclose more information about their financial condition. Observed risk premiums on subordinated debt could perhaps be used to help the FDIC set more accurate risk-based deposit insurance premiums, and such debt would provide an extra cushion of protection for taxpayers. An additional benefit of having subordinated debt traded on the open market is that price movements would provide a clear signal of the market’s evaluation of the bank’s financial condition that, even if it were not used to help price deposit insurance, could serve as an early warning aid to supervisors. Such an approach would most likely be limited to the largest banks for at least two reasons. First, it is the increased size of banking organizations and the financial innovations at the largest and most sophisticated banks that are challenging the effectiveness of the current regulatory and supervisory framework. 
It may therefore be appropriate, even essential, to differentiate the regulatory standards and supervisory practices at large, sophisticated, and internationally active banks -- the predominant sources of systemic risk -- from those that apply to small and medium-sized banks. Second, it is unclear just how deep and liquid a market in bank subordinated debt would be and what access small and medium-sized banks would have to this market. Today, almost no smaller banks, for example, issue subordinated debt, while a majority of the largest banks already participate to some degree in this market. However, even at the largest banks that participate, most of their subordinated debt appears to be held by their holding company, not by independent third parties. This suggests that the development of an operationally feasible program for mandatory subordinated debt would require a considerable amount of careful thought. Still, in my judgment, that thought might prove very worthwhile. Reform of the Basle Accord: Updating Risk-Based Capital Rules Capital supplements the discipline from “outsiders” -- the market -- by reinforcing the incentives of “insiders” -- the owners. Capital standards have therefore long been the foundation of the regulatory approach to safety and soundness in the banking system. The current capital standards are based on the Basle Accord adopted in 1988 by most of the world’s industrialized nations in an effort to encourage stronger capital at riskier banks, to include off-balance sheet exposures in the assessment of capital adequacy, and to establish more consistent capital standards across nations. The Accord was a major advance in 1988, and has proved to be very useful since then. But in recent years calls for reform have begun to grow. The Basle Accord capital standard divides bank on- and off-balance-sheet assets into four risk buckets, and then applies a different capital weight to each bucket. These weights are intended to increase with the riskiness of the assets in a given bucket. However, the relationship is rough. Perhaps most troublesome, the same risk weight is applied to all non-mortgage loans. Thus, for example, a loan to a very risky “junk bond” company gets the same weight as a loan to a “triple A” rated firm. While the current capital rules fail to differentiate among loans on the banking book by riskiness, banks are increasingly doing so, taking advantage of increasingly accurate models for measuring, managing, and pricing risk. These models allow banks to assign internal economic capital to the various loans and allow banks to identify those loans for which the economic capital is less than that required by the Basle Accord. Banks should then want either to remove from their banking books those loans for which regulatory capital exceeds economic capital or to find other ways to make regulatory capital converge to economic capital. Conversely, banks should want to keep loans to which their models assign more capital than the Basle standard requires. And banks have been doing just that, in part through innovations such as securitization of loans and credit derivatives. Banks use securitization, for example, to remove their lower risk loans -- which perhaps they otherwise would not originate -- from their banking books, reducing the associated capital charges. In a traditional securitization, a special purpose vehicle sells securities backed by a loan pool and uses the proceeds to purchase the loans from the bank. 
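To make the arithmetic concrete, the following is a minimal sketch of how a bank might compare the Accord's bucket-based charge with its own internal estimate of economic capital. The bucket weights are the familiar Accord categories; the loans, ratings, and economic-capital figures are purely hypothetical illustrations, not data from any actual bank.

```python
# Minimal sketch of Basle risk-bucket capital versus internal economic
# capital. Bucket weights follow the standard Accord categories; the loans
# and economic-capital figures are hypothetical.

BASLE_RISK_WEIGHTS = {
    "government": 0.00,   # OECD sovereign claims
    "bank": 0.20,         # OECD interbank claims
    "mortgage": 0.50,     # residential mortgages
    "corporate": 1.00,    # all other private claims, regardless of rating
}
MINIMUM_RATIO = 0.08      # 8% of risk-weighted assets

def regulatory_capital(balance: float, bucket: str) -> float:
    """Capital the Accord requires: 8% of the risk-weighted balance."""
    return balance * BASLE_RISK_WEIGHTS[bucket] * MINIMUM_RATIO

# Hypothetical loans: (name, balance in $ millions, bucket,
# the bank's own model-based economic capital in $ millions).
loans = [
    ("AAA-rated firm", 100.0, "corporate", 0.8),   # low risk, same bucket...
    ("junk-rated firm", 100.0, "corporate", 12.0), # ...as high risk
]

for name, balance, bucket, economic in loans:
    regulatory = regulatory_capital(balance, bucket)
    # Where the regulatory charge exceeds the bank's own estimate of risk,
    # the loan is a candidate for securitization or other arbitrage.
    action = "securitize" if regulatory > economic else "retain"
    print(f"{name}: regulatory ${regulatory:.1f}m vs "
          f"economic ${economic:.1f}m -> {action}")
```

On these hypothetical numbers, the same $8 million charge applies to both corporate loans, so it is the low-risk credit that is worth securitizing -- precisely the incentive just described.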
There might seem to be no problem with this arrangement, as long as banks shed the risks that go along with the loans. If this were the case, securitization would simply be a device for shifting to the capital markets the risks associated with the bank loans securitized. Banks would then continue their traditional role of originating the loans, but the capital markets would play an increased role in the funding of the loans. But, in reality, banks are required to keep part of the risk or provide some other enhancement in order to make the securities backed by the loans more acceptable to the capital market. Thus securitizations typically result in only a limited transference of risk. But the Basle capital rules typically reduce capital requirements against the retained bank exposure by more than banks lower the risk associated with the underlying loan pool. In addition, even if a bank completely transferred the risk associated with the underlying loan pool, eliminating the low-risk loans from the banking book raises the average risk level of the remaining loan portfolio, raising a question as to whether the 8% capital charge remains adequate. Credit derivatives are another technique for managing both risk and risk-based capital charges. Derivatives are ways of slicing and dicing the risks and returns of assets, so that the risks and returns can be allocated more efficiently among ultimate risk takers. With a credit derivative, a bank can transfer the risk and returns associated with a loan, even though from an accounting perspective the loan remains on the books of the bank. By using credit derivatives, banks can create “synthetic securitizations” to transfer risk and reduce risk-based capital requirements, while avoiding both the disclosure of an effective loan sale and the possible harm to a customer relationship. Such arrangements are also typically less costly than traditional securitizations in terms of administrative, legal, and other structuring costs. As with traditional securitizations, the concern from a regulatory perspective is that a bank is encouraged to reduce the risk-based capital requirement associated primarily with higher quality assets. This raises questions about its overall capital adequacy, given the remaining risks in its portfolio. This so-called “regulatory arbitrage” may not be all bad. It arises, after all, in response to deficiencies of the capital standards themselves, specifically their failure to reflect differences in the risk of loans within and across banks. Given the intense competition among banks, in part due to the globalization of the banking industry, banks are under pressure to limit their capital and thereby raise their profitability. When regulatory capital requirements are significantly greater than the economic capital banks would otherwise assign to a given loan, banks have to find ways to lower those capital charges, shift the risks associated with those loans elsewhere, or get out of the business of making such high quality loans. In the absence of such adjustments, the capital standards might otherwise drive banks out of the business of making high quality loans and therefore toward increased risk-taking. Regulatory capital arbitrage can therefore be a safety valve that allows banks to continue in the business of lending to high quality counterparties. But we must deal with the problem of the degree to which the capital held against the credit risk that remains at a bank after securitization accurately reflects that risk. 
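A simple numerical illustration shows how the capital relief can outrun the risk actually shed. The pool size, loss figures, and the deliberately stylized dollar-for-dollar treatment of the retained enhancement are assumptions chosen for illustration, not a statement of the precise rules.

```python
# Hypothetical illustration of securitization with a retained first-loss
# position. All figures, and the dollar-for-dollar charge assumed on the
# retained piece, are stylized.

pool = 100.0        # $ millions of high-quality loans securitized
first_loss = 2.0    # credit enhancement retained by the bank

capital_before = 0.08 * pool   # 8% charge on the whole pool
capital_after = first_loss     # assume a dollar-for-dollar charge on the
                               # retained first-loss piece only

stress_loss = 1.8   # a severe loss scenario for a pool of this quality
absorbed_by_bank = min(stress_loss, first_loss)

print(f"capital: ${capital_before:.1f}m before, ${capital_after:.1f}m after "
      f"({1 - capital_after / capital_before:.0%} relief)")
print(f"losses borne by the bank in the stress scenario: "
      f"{absorbed_by_bank / stress_loss:.0%}")
```

On these assumptions, required capital falls by three quarters even though, in any but a catastrophic scenario, the bank still bears essentially all of the pool’s credit losses.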
Perhaps even more critical is addressing the effect of both securitizations and credit derivatives on the capital associated with the risk level of the residual loan portfolio. The combination of a one-size-fits-all regulatory capital charge and the cherry-picking of securitizing low risks and retaining high risks is, I believe, our biggest problem with securitizations -- real and synthetic. To address these problems we need a two-track approach. In the near term, we need to implement rules that better match capital charges with the residual risk left after securitizations. This is a very complex problem. The banking agencies issued a proposed rulemaking to address this issue in November of 1997 and received many comments from the industry. A high priority in the near term must be putting in place an improved set of rules related to securitization. Over the intermediate term, we need to move more rapidly toward reform of the Basle capital standards, with a particular emphasis on dealing more effectively with the credit risk in the banking portfolio. This hopefully would reduce or eliminate the incentive to move low-risk assets out of the banking book and, in any case, permit capital charges to be set more systematically in relation to the risk in a given banking book. Because the markets are moving so rapidly, this has to be a high priority. Because these standards require international consensus and because the problem is complex and the solutions not entirely self-evident, the process of reform moves more slowly. But the time has come for putting this effort on the fast track. Financial Modernization: Updating the Legislative Framework As opposed to being on the fast track, I believe we can all agree that, in the United States, reform of our legal framework for banking regulation and supervision has been on the slow track. Indeed, much of our legal framework has essentially not changed since the 1930s. In the meantime, the distinctions between financial institutions have become increasingly blurred, as financial institutions have endeavored to broaden the activities they engage in to take advantage of synergies among the activities and to compete with other financial institutions. The result has been a relentless search for loopholes by the financial institutions and efforts by regulatory authorities to find ways within their charter to permit new activities in order to preserve the competitiveness of the institutions under their charge. This process is not only inefficient; it also creates new inequities, institutional structures that may produce unintended risks, and a misallocation of resources. That is why my colleagues and I are such strong supporters of H.R. 10, The Financial Modernization Act of 1998, which the House passed last month and the Senate Banking Committee approved in a modified form last week. Both bills bring our financial institutions into the 21st century in a framework that minimizes risks and inequities. The task they set for themselves is not easy. First and foremost, the bills would finally let financial institutions get into each other’s businesses, and thus widen the scope and range over which institutions can compete for the public’s business. Mind you, this is not a statement that says financial supermarkets and/or large institutions will be better or more successful than specialized and/or smaller institutions. But the benefit is that the public, not regulators, will decide which will prosper as competitors all bend their efforts to serve the consumer. That is the bottom line. 
It’s the reason why there should be financial modernization. But the bills also establish a structure of bank supervision that is consistent both with efficient resource allocation and with minimizing risk to the stability of the economy and the taxpayer. The key elements to that structure are some residual limits on bank activities, limitations on the location within the banking organization where the new activities can be conducted, a blending of functional and umbrella supervision, and the continuation of the Federal Reserve’s role in the consolidated supervision of banking organizations. Both the House and the Senate versions would permit banks to conduct in their own subsidiaries (so-called operating subsidiaries or “op subs”) only those activities that they may already conduct in the bank itself, plus financial agency activities, which by their nature require minimal funding and create minimal risk. These limitations, it seems to me, are crucial for several reasons. Banks have a lower cost of funds than other financial entities because of the safety net. As I discussed earlier, this federal safety net, and the subsidy that goes along with it, are provided by the Government in order to buy systemic stability. But it has a cost: increased risk taking by banks, reduced market discipline, and consequently the need for more onerous bank supervision in order to balance the resultant moral hazard. The last thing we should want is to extend that subsidy over a wider range of activities, which is, I believe, exactly what would happen if bank op subs could engage in wider nonbank financial activities. Not only would that increase the moral hazard -- and the need for bank-like supervision -- but it would also unbalance the competitive playing field between bank subs and independent firms engaging in the same business, a strange result for legislation whose ultimate purpose is to increase the competition for financial services. Both bills would require that organizations that conduct both banking and other financial businesses organize in a holding company form where the bank and the other activities are both subs of the holding company. Profits and losses of the business lines accrue to the holding company and thus neither directly benefit nor endanger the bank, the safety net, or the taxpayer. The safety net subsidy is not directly available to the holding company affiliates and competition is thus more balanced. Moreover, traditional regulators like the SEC and the state insurance commissioners still regulate the entities engaged in nonbank activities as if they were independent firms. Functional regulation is not only desirable for competitive equity; it is also a political necessity and a practical reality in the process of balancing that is required to pass financial modernization. In principle, functional regulation could also be applied to op subs, but the safety net, I submit, would soon create regulatory conflict with that structure. Importantly, both bills would prohibit commercial affiliations with banks. There is no doubt that it is becoming increasingly difficult to draw a bright line that separates financial services from nonfinancial businesses; it will only become more difficult to do so. But, the truth is that we are not sure enough of the implications of combining banking and commerce -- potential conflicts of interest, concentration of power, and safety net and stability concerns -- to move forward in this area. 
Better, I think, to digest financial reform before moving in an area that will be very difficult to reverse. Finally, the holding company framework of the bills would keep the Federal Reserve as the umbrella supervisor. I believe that the Fed has an important role to play in banking supervision in order to carry out its responsibilities for monetary policy, economic stabilization, and crisis management. I cannot grasp how we could possibly understand what is happening in banking markets, what innovations are occurring and their implications, and the nature and quality of the risk exposures and controls so critical for crisis management and policy formulation without the hands-on practical exposure that comes from supervision. An umbrella supervisor is needed for complex organizations in order to assure that the entire organization and its policies and controls are well managed and consistent with financial stability. At least for the large organizations, I believe that supervisor should be the Federal Reserve so that we can play our role as a central bank and international crisis manager. Conclusion The rapid changes in banking and financial services markets are creating opportunities and challenges. Increased competition in the financial services industry and increased synergies provided by financial services firms promise important benefits to consumers. But the increased size, breadth, complexity, and geographic scope of banking have increased the challenge of managing and of regulating and supervising banks. Banks have responded by developing new approaches to measuring and managing risk. Now it is time for regulatory standards and supervisory practice to catch up. This catching up must involve updating the legislative framework underlying banking. It must also involve updating the international capital standards that have been the foundation of the regulatory framework. And it may also be useful to reinforce the role of market discipline. Working together, the markets, the political process, and the regulators can ensure that we take advantage of the new opportunities while maintaining the safety and soundness of our banking system.
|
board of governors of the federal reserve system
| 1,998 | 9 |
Remarks by Mr. Roger W. Ferguson, Jr., a member of the Board of Governors of the US Federal Reserve System, at the Federal Reserve Bank of Kansas City on 17/9/98.
|
Mr. Ferguson discusses the state of the US economy: near-term challenges and long-term changes Remarks by Mr. Roger W. Ferguson, Jr., a member of the Board of Governors of the US Federal Reserve System, at the Federal Reserve Bank of Kansas City on 17/9/98. It is a pleasure to be here today to discuss the state of the economy, not least because -- despite some turbulence of late in financial markets -- there is still good news to report. We are now in the eighth year of economic expansion. Production, employment, and incomes have all been rising briskly, and the unemployment rate has declined to its lowest level since 1970. And at the same time, inflation has remained remarkably subdued. But the prudent central banker cannot afford to focus exclusively on the good performance of the past. He must constantly look as well to the future. In that spirit, today I shall not only discuss our current economic situation, but I shall also comment on several areas that I believe deserve particular attention in the period ahead. Some of these areas involve potential risks to our fine economic performance, and some involve potentially important structural changes to our economy. Before I begin, let me remind you that these views are personal and do not necessarily reflect the views of other members of the Federal Open Market Committee or the Board of Governors. Recent Economic Performance As you know, the United States has been experiencing a period of exceptionally strong economic growth. Real GDP rose nearly 4 percent over 1996 and again last year, the fastest rates of increase, two years in a row, in fifteen years. And activity maintained a rapid pace of expansion over the first half of this year as well, although growth apparently did step down some in the spring. This recent performance is especially impressive because it occurred in the face of a substantial reduction in export demand associated with the crisis in Asia. This powerful economic growth has benefited American businesses through soaring profits. It also has benefited our governments -- federal, state, and local -- through tax revenues that have come in above earlier expectations. And it has benefited American workers, through higher incomes and lower unemployment rates than almost anyone predicted. Our tight labor markets have had the tremendous dividend of giving job experience to people who otherwise would be on the margins of the economy -- experience that will serve these people well for many years to come. Tight labor markets are welcome, even to central bankers, where they can be sustained without producing cost and price pressures that will undermine prosperity. The driving force behind this sustained growth has been domestic demand, as reflected in enormous increases in both consumption and investment spending. Consumption spending has been fueled by rapid growth of employment and incomes. With some 6 million people added to payrolls over the past two years, and with compensation gains well in excess of inflation, consumers have had the means to boost their spending appreciably. And the more than doubling in the value of the stock market since late 1994 -- only a small fraction of which has been reversed in recent weeks, I might add -- has made people more comfortable about spending out of current income -- and probably out of accumulated assets, too. It should be no surprise that surveys that measure consumer confidence have recorded historic highs, nor that the personal saving rate has declined considerably. 
The same factors, along with the lowest mortgage rates in some time, have powered home sales to record levels. Favorable financial conditions, along with exceptionally large price declines for computer equipment, have made it relatively cheap for firms to invest in new plant and equipment. And with sales in a solid uptrend, this has produced a remarkable boom in capital spending. Business fixed investment was up nearly 10 percent last year, and it rose at a still faster pace in the first half of this year. Even more strikingly, investment has averaged 10 percent annual growth over the past five years, making this the most rapid sustained expansion of investment in thirty years. As I shall discuss shortly, this investment may provide reason for some to be optimistic about productivity growth in the period ahead. But, without doubt, it has played an important role in helping to boost aggregate demand in recent quarters. One important question looking forward is whether the recent rapid growth in demand can be sustained. I am not a forecaster, but it is important to know what the consensus among forecasters has been, and most have been expecting growth to slow from its recent pace. One reason forecasters have expected growth to slow is that production for inventories had been rising at an extremely rapid pace that was unlikely to be maintained. And indeed, inventory investment did drop back a good bit in the second quarter, contributing importantly to that quarter’s more modest GDP growth. Still, most analysts believe that the rate of stockbuilding -- at least outside the motor vehicle sector -- probably remained unsustainably high and can be expected to slow further. A second reason forecasters have expected a slowdown is that the impetus to consumer spending from annual gains of 25 to 30 percent in share values seemed unlikely to be sustained. This may seem obvious after last month’s drop in stock prices, but looking back to earlier in the year even the most optimistic forecasters were expecting stock market increases well short of the rates seen in 1996 and 1997. The impetus to spending from such modest increases in household wealth would be considerably smaller than was generated by last year’s outstanding run-up in share prices. And of course, last month’s decline may provide further reason to expect consumption growth to slow. International Risks A third important reason forecasters have expected slower growth ahead is the continuing troubles overseas. The problems faced by our Asian trading partners, including the persistent weakness of the Japanese economy, and the accompanying strength of the dollar, already have led to a sharp reduction in the demand for our exports. Indeed, in the first half of this year, the quantity of exports declined two quarters in a row for the first time since the mid-1980s, with shipments to Korea and Japan down sharply. Declines in exports of machinery, industrial supplies, and agricultural equipment were especially noteworthy. When we include the effect of rising imports as well, net exports subtracted more than 2 percentage points from GDP growth over this period, and our trade deficit on goods and services widened to about $175 billion at an annual rate in the second quarter. I should emphasize that not all of the effects on our economy of the Asian crisis have been adverse. 
Low prices this year of energy and other internationally traded commodities, including many agricultural commodities such as soybeans and wheat, have benefited much of our nation -- although certainly not our oil industry or our farmers. Those low prices reflect, in part, a decline in the demand from Asia. To put it another way, those nations exporting oil and other commodities have borne some of the brunt of the sagging Asian economies. Furthermore, the favorable financial conditions in this country may have been aided by a “flight to safety” that helped hold down our long-term interest rates and stimulate interest-sensitive sectors like housing. Thus, the seemingly fortuitous timing of extremely robust domestic demand that has offset the reduction in our export demand has not been entirely coincidental. As with any financial crisis, it is extremely difficult to predict what the future will bring for the troubled Asian countries. I remain guardedly optimistic that conditions will begin to stabilize in the coming months, and that export demand will start to firm at least a little next year. But that is by no means a certainty. One particular risk is that the problems that have thus far been most severe in Asia may become more severe in other regions of the world. We have already seen this in Russia, and financial difficulties in other countries as distinct as South Africa and Brazil emphasize the worries and uncertainties on this score. For a variety of reasons, there could be more “contagion” to other countries. First, investors’ appetite for international risks has gone down following the Asian crises. This has led to trouble for countries that are running current account deficits and so are dependent on foreign capital. While most of Russia’s problems are home-grown, for example, more cautious investors probably contributed to precipitating that country’s crisis. Indeed, over the last month it has become apparent that even US financial markets are not immune to perceptions that risk has increased. Second, as some emerging-market countries with fixed exchange rates respond to market pressures by relaxing their exchange-rate arrangements, those remaining countries attempting to peg their exchange rates may come under increasing speculative pressures. Third, countries that have close economic ties to the hardest-hit Asian nations certainly are feeling the repercussions of a reduction in demand for their exports, and a greater economic slowdown in Asia could trigger financial problems in those countries, too. And finally, as I noted earlier, countries that are heavy exporters of oil and other primary commodities have been hard-hit by low prices. In short, international problems in Asia and elsewhere are proving to be more profound and sustained than many, myself included, were guessing and hoping would be the case. Obviously, the effect of this drag on our ongoing economic expansion is an area that deserves close scrutiny in the months ahead. Inflation Developments But as I noted, the flip side of reduced export demand and a strong dollar is that the international situation is helping to hold down inflation here at home. This helps to explain the remarkable fact that our strong economic growth and low unemployment rates have coincided with exceptionally favorable news on the inflation front. I cannot overemphasize the importance of this story: rising inflation has sown the seeds of almost every cyclical downturn of the last half century. 
The consumer price index increased less than 2 percent over the past year. The last time consumer inflation was this low was in 1986 when, as in the present instance, consumers benefited from declining prices of energy products. But even when we exclude volatile food and energy items and focus on the so-called “core” CPI, inflation has been running not much above 2 percent, the lowest rate in more than three decades. And a broader measure of prices, the GDP price index, has increased only one percent over the past year, held down by rapid declines in the prices businesses pay for computers and communication equipment. This inflation performance has been better than most analysts had expected. But we should not exaggerate the extent of the surprise. True, most economists believe that the current unemployment rate of 4½ percent is below the level consistent with stable inflation, and indeed may already be exerting upward pressure on labor costs. But almost all sensible economists have long realized that unemployment is not the only influence on inflation. As I have emphasized, the strong dollar has led to falling prices for our imports, and low oil prices have led to low prices for gasoline and other energy products. Low oil prices also have helped hold down prices for petroleum-derived products like fertilizers and plastics, and they have reduced price pressures more generally by reducing firms’ utility and transportation costs. Furthermore, structural changes in the health care system have reduced the growth of firms’ health care costs. These factors have all contributed to our benign inflation performance. We will need to be vigilant because, just as these factors have worked to the benefit of low inflation recently, any or all of them may turn around and exert upward pressure on inflation in the future. Changes in the Labor Market One of the complexities in the inflation outlook involves changes that appear to be occurring in labor market practices. The extraordinary rise in labor demand and the accompanying tight labor markets have led to labor shortages in some parts of the country. Shortages have been reported for a variety of jobs, with the supply of computer professionals especially tight. To some extent, firms have been responding to these labor shortages by raising wages. And indeed, compensation increases have been growing in size for the past two to three years. But firms also have responded to tight labor markets in ways other than simply granting larger base wage increases. Firms have been making increasing use of various forms of targeted pay, such as hiring and retention bonuses, as a way to attract and retain certain key employees without granting a general wage increase. These targeted bonuses seem to be most prevalent for information processing workers, but I sense that they are reasonably widespread beyond that area as well. And these bonuses are often quite sizable. It is not uncommon to hear of hiring bonuses of five to ten percent of base salary, or higher. Firms are changing their compensation practices in other ways, too. Businesses increasingly are relying less on base wage and salary increases, and are substituting some form of variable pay that is tied directly to performance, such as annual bonuses and profit sharing plans, or even stock options that extend well beyond senior management. These changing labor market practices have implications for the functioning of the economy that are both interesting and, potentially, quite important. 
For one thing, many targeted bonuses may not be adequately captured by our aggregate wage statistics, so if these bonuses are becoming more prevalent, labor costs may be rising somewhat faster than the published data would indicate. Second, these practices may lead to changes in the cyclicality of firms’ compensation costs. On the one hand, if firms now are able to use targeted bonuses to certain workers in place of generalized wage increases, this could hold down compensation costs in tight labor markets, thereby making these costs less cyclical. On the other hand, increasing use of variable pay will tend to make compensation more cyclical than it otherwise would have been. Under a profit sharing system, for example, firms make larger compensation payments to their employees when times are good than they do when business is slack. If the latter effect were to dominate, and compensation were to become more cyclical, it would raise interesting questions about how this change might affect firms’ pricing behavior. To some extent, of course, firms probably smooth through the cyclical ups and downs in their costs when making price decisions in any case. But such smoothing probably is not perfect, and increases in fixed pay probably would exert some upward pressure on prices. Increased use of profit-sharing bonuses, however, would presumably lead firms to boost their employees’ compensation precisely when they can most afford to do so. And, conversely, when profits are being squeezed, the resulting decline in bonus payments would help ease the firm’s cost pressures. Thus, there may be reason to believe that increases in variable compensation payments might not lead to the same price pressures as would increases in fixed compensation payments. In other words, increased cyclicality of compensation gains might not imply any change in the cyclicality of price increases. Finally, the spread of variable pay could lead to changes in employment patterns. Pay that is more variable is more tied to profitability. As profitability declines, so will compensation, and it is possible that employment may not drop as much as it would in a world in which compensation is less closely tied to profits. In any event, it will be fascinating to watch these labor market developments unfold, and to try to sort out the implications for the macroeconomy. Of course, companies are not moving toward variable compensation schemes to alter the macro-dynamics of the economy. Firms are moving to variable compensation schemes because they hope these changes will enhance their profitability, in part by raising worker productivity. By giving workers a larger and more direct stake in the fortunes of the company, firms are providing incentives for their employees to work more efficiently and to suggest productivity-enhancing changes to the way products are made and business is conducted. It is hard to know how successful these efforts will ultimately prove to be. Many business people are convinced that variable compensation plans have paid off in terms of higher productivity. Many others are less confident, and say that they hope the changes are boosting productivity, but that it is very hard to pinpoint the sources of productivity improvements. This discussion points to one final aspect of our economic outlook that deserves mention, namely, productivity growth. In the long run, nothing is as important for our economic welfare as productivity growth, for this is what determines the pace of increase in living standards. 
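The force of that last point is easy to check with compound-growth arithmetic. The trend rates in the sketch below are illustrative, not forecasts.

```python
import math

# Back-of-the-envelope compounding: how long output per hour (and, roughly,
# living standards) takes to double at illustrative trend growth rates.

def years_to_double(growth_rate: float) -> float:
    """Years for a quantity growing at a constant rate to double."""
    return math.log(2.0) / math.log(1.0 + growth_rate)

for rate in (0.01, 0.0175, 0.02):
    print(f"{rate:.2%} trend growth -> living standards double "
          f"in about {years_to_double(rate):.0f} years")
```

At a 1 percent trend, living standards double roughly every 70 years; at 2 percent, every 35 -- a difference of a full generation.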
And the recent productivity data have been impressive: output per hour worked rose 1¾ percent in 1997 after an advance of more than 2 percent the year before. The big question is whether this rapid productivity growth is temporary or is permanent -- that is, whether productivity has merely displayed its normal short-run response to a step-up in the growth of activity, or whether the faster pace can be sustained for some time even as output growth moderates. The only correct answer, of course, is that we don’t know yet. I can cite a few reasons to be very cautiously optimistic that productivity may be on a more favorable uptrend than it was in the 1980s. One reason is the changing incentives within the firm owing to variable pay that I just discussed, which at least some firms strongly believe has aided their productivity performance. A second reason for optimism is the investment boom that I discussed earlier. Economic history teaches us fairly convincingly that growth in the amount of capital per worker is an important determinant of gains in output per worker. Furthermore, some analysts have argued that the computerization of our economy is only now beginning to bear fruit, and that we may expect impressive productivity advances as people learn to use the new technology effectively. The economist Paul David notes that it took several decades following the initial development of the electric motor for companies to reorganize their production techniques to use the new invention efficiently. By analogy, David’s argument hints that the largest benefits of computerization may be yet to come. This analogy is intriguing, but counterarguments certainly can be made as well. One reason it took electric power so long to yield benefits is that it took many years for the new technology to gain widespread use, in part because of the huge expenditures required to reconfigure manufacturing plants to use electric power. By contrast, the transition from older office equipment to mainframe computers, and then to desktop machines, often entails much smaller adjustment costs and so has been comparatively rapid. Furthermore, it is easy to forget that, while computers account for a large share of investment, they account for only a small share of the overall capital stock, in part reflecting their rapid rate of depreciation. Thus, unless computers earn a considerably higher return than other investments that firms might make, the small share imposes some limits on the contribution computers could make to overall productivity growth. So, while we can be hopeful that the recent productivity increases may represent a faster long-term trend, we should not count our chickens too soon. But you certainly can add productivity to the list of items I will be following closely in the period ahead. Conclusion With strong output growth, low unemployment, low inflation, and rapid productivity growth, these have certainly been good times to be a central banker. But this does not mean that they are not interesting times, as a glance at the financial pages will drive home. Although our economy is basically very strong, we also face near-term and longer-term challenges, and I hope I have given you a sense of some of the issues that will be on my mind as I try to gauge our nation’s economic performance in the period ahead.
|
board of governors of the federal reserve system
| 1,998 | 9 |
Testimony by the Chairman of the Board of Governors of the US Federal Reserve System, Mr. Alan Greenspan, before the Committee on Banking and Financial Services of the US House of Representatives on 1/10/98.
|
Mr. Greenspan testifies on private-sector refinancing of the large hedge fund, Long-Term Capital Management Testimony by the Chairman of the Board of Governors of the US Federal Reserve System, Mr. Alan Greenspan, before the Committee on Banking and Financial Services of the US House of Representatives on 1/10/98. Mr. Chairman and other members of the Committee, I thank you for this opportunity to report on the Federal Reserve’s role in facilitating the private-sector refinancing of the large hedge fund, Long-Term Capital Management (LTCM). In my remarks this morning, I will attempt to put into some perspective the events of the past few weeks and discuss some questions of importance to public policy makers that they raise. The Federal Reserve Bank of New York’s efforts were designed solely to enhance the probability of an orderly private-sector adjustment, not to dictate the path that adjustment would take. As President McDonough just related, no Federal Reserve funds were put at risk, no promises were made by the Federal Reserve, and no individual firms were pressured to participate. Officials of the Federal Reserve Bank of New York facilitated discussions in which the private parties arrived at an agreement that both served their mutual self-interest and avoided possible serious market dislocations. Financial market participants were already unsettled by recent global events. Had the failure of LTCM triggered the seizing up of markets, substantial damage could have been inflicted on many market participants, including some not directly involved with the firm, and could have potentially impaired the economies of many nations, including our own. With credit spreads already elevated and the market prices of risky assets under considerable downward pressure, Federal Reserve officials moved more quickly to provide their good offices to help resolve the affairs of LTCM than would have been the case in more normal times. In effect, the threshold of action was lowered by the knowledge that markets had recently become fragile. Moreover, our sense was that the consequences of a fire sale triggered by cross-default clauses, should LTCM fail on some of its obligations, risked a severe drying up of market liquidity. The plight of LTCM might scarcely have caused a ripple in financial markets or among federal regulators 18 months ago--but in current circumstances it was judged to warrant attention. What is remarkable is not this episode, but the relative absence of such examples over the past five years. Dynamic markets periodically engender large defaults. Events of the Past Few Weeks LTCM is a hedge fund, or a mutual fund that is structured to avoid regulation by limiting its clientele to a small number of highly sophisticated, very wealthy individuals and that seeks high rates of return by investing and trading in a variety of financial instruments. Since its founding in 1994, LTCM has had a prominent position in the community of hedge funds, in part because of its assemblage of talent in pricing and trading financial instruments, as well as its large initial capital stake. In its first few years of business, it earned an enviable reputation by racking up a string of above-normal returns for its investors. LTCM appears principally to have garnered those returns by making judgements on interest rate spreads and the volatility of market prices. 
In its search for high return, LTCM levered its capital through securities repurchase contracts and derivatives transactions, relying on sophisticated mathematical models of behavior to guide those transactions. As long as the configuration of returns generally mimicked their historical patterns, LTCM’s mathematical models of asset pricing could be used to ferret out temporary market price anomalies. Their trading both closed such price gaps and earned an extra bit of return on capital for them. But it is the nature of the competitive process driving financial innovation that such techniques would be emulated, making it ever more difficult to find market anomalies that provided shareholders with a high return. Indeed, the very efficiencies that LTCM and its competitors brought to the overall financial system gradually reduced the opportunities for above-normal profits. LTCM itself acknowledged this when returning $2¾ billion of capital to investors at the end of 1997. To counter these diminishing opportunities, LTCM apparently reached further for return over time by employing more leverage and increasing its exposure to risk, a strategy that was destined to fail. Unfortunately for its shareholders, LTCM chose this exposure just as financial market uncertainty and investor risk aversion began to rise rapidly around the world. In that environment - so at variance with the experience built into its models - LTCM’s embrace of risk on a large scale produced stunning losses. As we now know, by the end of August the firm had lost half its capital base. And as September unfolded, the bleeding continued. The firm, however, apparently did not unwind its positions significantly. In our dynamic market economy, investors and traders, at times, make misjudgements. When market prices and interest rates adjust promptly to evidence of such mistakes, their consequences are generally felt mostly by the perpetrators and, thus, rarely cumulate to pose significant problems for the financial system as a whole. Indeed, the operation of an effective market economy necessitates that investment funds committed to capital projects that do not accurately reflect consumer and business preferences should incur losses and ultimately be liquidated. What value is left needs to be redirected to profitable uses - those that more accurately reflect market preferences. By such winnowing of inefficiencies, productivity is enhanced and standards of living expand over time. Financial markets operate efficiently only when participants can commit to transactions with reasonable confidence that the risk of non-payment can be rationally judged and compensated for. Effective and seasoned markets pass this test almost all of the time. On rare occasions, they do not. Fear, whether irrational or otherwise, grips participants and they unthinkingly disengage from risky assets in favor of those providing safety and liquidity. The subtle distinctions that investors make, so critical to the effective operation of financial markets, are abandoned. Assets, good and bad, are dumped indiscriminately in circumstances of high uncertainty and fear that are not conducive to planning and investment. Such circumstances, were they generalized and persistent, would be wholly inconsistent with the functioning of sophisticated economies supported by long-term capital investment. Quickly unwinding a complicated portfolio that contains exposure to all manner of risks, such as that of LTCM, in such market conditions amounts to conducting a fire sale. 
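The speed of such losses is easiest to see with simple leverage arithmetic. The leverage ratio and the size of the adverse move in the sketch below are assumptions chosen for illustration, not LTCM’s actual figures.

```python
# Illustrative leverage arithmetic. The 25-to-1 ratio and the 2% adverse
# move are assumptions for illustration, not LTCM's actual figures.

capital = 4.0                 # $ billions of equity capital
leverage = 25.0               # assets carried per dollar of capital
assets = capital * leverage   # $100 billion of positions, largely via
                              # repurchase agreements and derivatives

adverse_move = 0.02           # spreads widen; portfolio loses 2% of assets
loss = assets * adverse_move  # the resulting hit to capital

print(f"a {adverse_move:.0%} move against ${assets:.0f}bn of positions "
      f"erases ${loss:.0f}bn, or {loss / capital:.0%} of capital")
```

A move of only a couple of percent against so highly levered a book wipes out half the equity; unwinding positions of that size into one-sided markets only worsens the arithmetic.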
The prices received in a time of stress do not reflect longer-run potential, adding to the losses incurred. Of course, a fire sale that transfers wealth from one set of sophisticated market players to another, without any impact on the financial system overall, should not be a concern for the central bank. Moreover, creditors should reasonably be expected to put some weight on the possibility of a large market swing when making their risk assessments. Indeed, when we examine banks we expect them to have systems in place that take account of outsized market moves. However, a fire sale may be sufficiently intense and widespread that it seriously distorts markets and elevates uncertainty enough to impair the overall functioning of the economy. Sophisticated economic systems cannot thrive in such an atmosphere. The scale and scope of LTCM’s operations, which encompassed many markets, maturities, and currencies and often relied on instruments that were thinly traded and had prices that were not continuously quoted, made it exceptionally difficult to predict the broader ramifications of attempting to close out its positions precipitately. That its mistakes should be unwound and losses incurred was never open to question. How they should be unwound and when those losses should be incurred so as to foster the continued smooth operation of financial markets was much more difficult to assess. The price gyrations that would have evolved from a fire sale would have reflected fear-driven judgements that could only impair effective market functioning and generate losses for innocent bystanders. While the principle that fire sales undermine the effective functioning of markets may be clear, deciding when a potential market disruption rises to a level of seriousness warranting central bank involvement is among the most difficult judgements that ever confront a central banker. In situations like this, there is no reason for central bank involvement unless there is a substantial probability that a fire sale would result in severe, widespread, and prolonged disruptions to financial market activity. It was the judgement of officials at the Federal Reserve Bank of New York, who were monitoring the situation on an ongoing basis, that the act of unwinding LTCM’s portfolio in a forced liquidation would not only have a significant distorting impact on market prices but could also, in the process, produce large losses, or worse, for a number of creditors and counterparties, and for other market participants who were not directly involved with LTCM. In that environment, it was the FRBNY’s judgement that it was to the advantage of all parties - including the creditors and other market participants - to engender if at all possible an orderly resolution rather than let the firm go into disorderly fire-sale liquidation following a set of cascading cross defaults. As President McDonough has detailed, officers of the Federal Reserve Bank of New York contacted a number of creditors and asked if there were alternatives to forcing the firm into bankruptcy. At the same time, FRBNY officers informed some of their colleagues at the Federal Reserve Board, the Treasury, and other financial regulators of their ongoing activities. The troubles of LTCM were not a complete surprise to its counterparties. After all, LTCM’s earlier statements regarding its August losses were well known, and sophisticated counterparties understood the difficulties in closing out large losing positions. 
In addition, the commercial banks among its creditors had already begun taking normal precautionary measures associated with exposure to counterparties whose condition is deteriorating. Still, creditors as a whole most likely underestimated the size and scope of the market bets that LTCM was undertaking, an issue that is currently under review. On September 23, the private sector parties arrived at an agreement providing a capital infusion of about $3½ billion in return for substantially diluting existing shareholders’ stake in LTCM. Control of the firm passed from the current management to a committee determined from the outside by the new investors. Those investors intend to shrink LTCM’s portfolio so as to reduce risk of loss and return the remaining capital to the investors as soon as practicable. I do not rule out the possibility that the new owners of what is left of LTCM may decide to keep part of it in business. That is their judgement to make. This agreement was not a government bailout, in that Federal Reserve funds were neither provided nor ever even suggested. Agreements were not forced upon unwilling market participants. Creditors and counterparties calculated that LTCM and, accordingly, their claims, would be worth more over time if the liquidation of LTCM’s portfolio was orderly as opposed to being subject to a fire sale. And with markets currently volatile and investors skittish, putting a special premium on the timely resolution of LTCM’s problems seemed entirely appropriate as a matter of public policy. Of course, any time that there is public involvement that softens the blow of private-sector losses - even as obliquely as in this episode - the issue of moral hazard arises. Any action by the government that prevents some of the negative consequences to the private sector of the mistakes it makes raises the threshold of risks market participants will presumably subsequently choose to take. Over time, economic efficiency will be impaired as some uneconomic investments are undertaken under the implicit assumption that possible losses may be borne by the government. But is much moral hazard created by aborting fire sales? To be sure, investors wiped out in a fire sale will clearly be less risk prone than if their mistakes were unwound in a more orderly fashion. But is the broader market well served if the resulting fear and other irrational judgements govern the degree of risk participants are subsequently willing to incur? Risk taking is a necessary condition for wealth creation. The optimum degree of risk aversion should be governed by rational judgements about the market place, not the fear flowing from fire sales. The Federal Reserve provided its good offices to LTCM’s creditors, not to protect LTCM’s investors, creditors, or managers from loss, but to avoid the distortions to market processes caused by a fire-sale liquidation and the consequent spreading of those distortions through contagion. To be sure, this may well work to reduce the ultimate losses to the original owners of LTCM, but that was a by-product, perhaps unfortunate, of the process. I should add that, in order to keep incentives working in their favor, the creditors of LTCM apparently also understood the importance of some cushioning of the losses to the owners and managers of the firm. 
The private creditors and counterparties in the rescue package chose to preserve a sliver of equity for the original owners - one tenth - so that some of the management would have an incentive to stay with the firm to assist in the liquidation of the portfolio. Reportedly, the creditors felt that, given the complexity of market bets woven into a bewildering array of financial contracts, working with the existing management would be far easier than starting from scratch.

Some Questions for Policy Makers

Without doubt, extensive study will be required to put the events of the past few weeks into proper perspective. As a member of the President’s Working Group on Financial Markets, I support Secretary Rubin’s call for a special study on the public policy implications of hedge funds. While the affairs of LTCM are by no means settled, I would like to pose some tentative questions that may have to be addressed. First, how much dependence should be placed on financial modelling, which, for all its sophistication, can get too far ahead of human judgement? This decade is strewn with examples of bright people who thought they had built a better mousetrap that could consistently extract an abnormal return from financial markets. Some succeed for a time. But while there may occasionally be misconfigurations among market prices that allow abnormal returns, they do not persist. Indeed, efforts to take advantage of such misalignments force prices into better alignment and are soon emulated by competitors, further narrowing, or eliminating, any gaps. No matter how skilful the trading scheme, over the long haul, abnormal returns are sustained only through abnormal exposure to risk. Second, what steps could counterparties have taken to ensure that they had properly estimated their exposure, particularly in markets that are volatile? To an important degree, the creditors of LTCM were induced to infuse capital into the firm because they failed to stress test their counterparty exposures adequately and therefore underestimated the size of the uncollateralized exposure that they could face in volatile and illiquid markets. In part, this also reflected an underappreciation of the volume and nature of the risks LTCM had undertaken and its relative size in the overall market. By failing to make those determinations, its fellow market participants failed to put an adequate brake on LTCM’s use of leverage. To be sure, sometimes decisions are based on judgements about the soundness of borrowers that are accepted from third parties or, possibly in this case, that are founded on the impressive qualifications of LTCM’s principals. In some cases, such truncated risk appraisals may be accurate, but they are not a substitute for a rigorous analysis by the lender of the borrower’s overall creditworthiness and risk profile. Third, in this regard, what lessons are there for bank regulators? Domestic commercial bank exposure to LTCM included both direct lending and acting as counterparties to the firm in derivatives contracts. A preliminary review of bank dealings with LTCM suggests that the banks have collateral adequate to cover most of their current mark-to-market exposures with LTCM. The unexpected surge in risk aversion and the dramatic opening up of interest rate spreads in August obviously caught LTCM wrong-footed. Counterparties, including banks, continued to collect collateral for marks to market.
What they were not collateralized against was the losses that might have occurred when prices moved even further and market liquidity dried up in a fire sale. Supervisors of banks and securities firms must assess whether current procedures regarding stress testing and counterparty assessment could have been improved to enable counterparties to take steps to insulate themselves better from LTCM’s debacle. More important will be the assessment of whether those procedures are adequate for the future. But this is an area in which much work has been ongoing. During the fourth quarter of 1997 and the first quarter of 1998, supervision staff of the Federal Reserve Bank of New York and the Board met with managers at several major New York banking institutions to discuss their current relationships with hedge funds, updating a similar study conducted 3½ years earlier. Fourth, does the fact that investors have lost most of their capital and creditors may take some losses on their exposure to LTCM call for direct regulation of hedge funds? It is questionable whether hedge funds can be effectively regulated directly in the United States alone. While their financial clout may be large, hedge funds’ physical presence is small. Given the amazing communication capabilities available virtually around the globe, trades can be initiated from almost any location. Indeed, most hedge funds are only a short step from cyberspace. Any direct US regulations restricting their flexibility will doubtless induce the more aggressive funds to emigrate from under our jurisdiction. The best we can do in my judgment is what we do today: Regulate them indirectly through the regulation of the sources of their funds. We are thus able to monitor hedge funds’ activity far better, especially as they influence US financial markets. If the funds move abroad, our oversight will diminish. In the first line of risk defense, if I may put it that way, are hedge funds’ lenders and counterparties. Commercial and investment banks especially have the analytic skills to judge the degree of risk to which the funds are exposed. Their self-interest has, with few exceptions but including the one we are discussing today, controlled the risk posed by hedge funds. Banking supervisors are the second line of risk defense in their examination of lending procedures for safety and soundness. We neither try, nor should we endeavor, to micro-manage bank lending activity. We have nonetheless built up significant capabilities in evaluating the complex lending practices in OTC derivatives markets and hedge funds. If, somehow, hedge funds were barred world-wide, the American financial system would lose the benefits conveyed by their efforts, including arbitraging price differentials away. The resulting loss in efficiency and contribution to financial value added and the nation’s standard of living would be a high price to pay - to my mind, too high a price. Fifth, how much weight should concerns about moral hazard be given when designing mechanisms for governmental regulation of markets? By way of example, we should note that were banks required by the market, or their regulator, to hold 40 percent capital against assets as they did after the Civil War, there would, of course, be far less moral hazard and far fewer instances of fire-sale market disruptions. At the same time, far fewer banks would be profitable, the degree of financial intermediation less, capital would be more costly, and the level of output and standards of living decidedly lower.
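To make that intermediation trade-off concrete, here is a back-of-the-envelope sketch; only the 40 percent figure comes from the speech, while the 8 percent Basle-style comparison and the equity base are my own illustrative assumptions:

    # Back-of-the-envelope: how a capital requirement caps lending capacity.
    # Only the 40 percent ratio comes from the speech; the 8 percent
    # comparison and the equity base are illustrative assumptions.

    def asset_capacity(equity, capital_ratio):
        """Assets a bank can carry while keeping equity/assets >= capital_ratio."""
        return equity / capital_ratio

    equity = 100.0  # hypothetical equity base, in millions

    for ratio in (0.40, 0.08):
        assets = asset_capacity(equity, ratio)
        print(f"capital ratio {ratio:.0%}: assets {assets:,.0f}m, leverage {assets / equity:.1f}x")

    # Prints:
    # capital ratio 40%: assets 250m, leverage 2.5x
    # capital ratio 8%: assets 1,250m, leverage 12.5x

The same equity base supports five times as much intermediation at the lower ratio, which is the sense in which a post-Civil War 40 percent requirement would leave far less room for moral hazard but would also make credit scarcer and more expensive.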
Our current economy, with its wide financial safety net, fiat money, and highly leveraged financial institutions, has been a conscious choice of the American people since the 1930s. We do not have the choice of accepting the benefits of the current system without its costs.

Conclusion

For so long as there have been financial markets, participants have had on occasion to weigh the costs and, especially, the externalities associated with fire-sale liquidations of troubled entities against short-term assistance to tide the firms over for a time. It was such a balancing of near-term costs and longer-term benefits that presumably led J.P. Morgan to convene the leading bankers of his age - both commercial and investment - in his library in 1907 to address the severe panic of that year. Such episodes were recognized as among those rare occasions when otherwise highly effective markets seize up and temporary ad hoc responses were required. The convening of LTCM investors and lenders last week at the Federal Reserve Bank of New York could be viewed in that long tradition. It should similarly be viewed as a rare occasion, warranted because of the potential for serious disruptions to markets. We must also remain mindful of where to draw the line at which public-sector involvement ends. The efforts last week were limited to facilitating a private-sector agreement and had no implications for Federal Reserve resources or policies.

Footnotes

1 At the same time, not all fire sales are without merit. The Resolution Trust Corporation earlier this decade chose to offer commercial real estate in what might be termed a fire sale because it was the only way an otherwise seized-up market could be galvanized. Some level of market prices had to be established - even if below “intrinsic” or longer-run value - in order to re-establish a two-way market. This was a special case.
Mr. Ferguson discusses the consumer side of electronic banking and its implications for Federal Reserve policy development

Remarks by Mr. Roger W. Ferguson, Jr., a member of the Board of Governors of the US Federal Reserve System, at the Bank Administration Institute’s Symposium on Payments System Strategy held in Washington, D.C. on 29/9/98.

Electronic Banking: Where are the customers? What do they think? What does it mean for the Federal Reserve?

It is my pleasure to be here this afternoon, and I want to thank the BAI for inviting me to speak to this symposium on payments system strategy. The focus of my remarks today will be on the interaction between electronic banking and the household sector, how this interaction should influence our “strategic” thinking about the evolution of the demand for electronic banking by households, and what this may mean for the strategies the Federal Reserve should adopt in developing policy in the payments area, and in other areas as well. Some of my comments will, by necessity, be quite speculative, and perhaps even somewhat controversial. But my desire is to contribute positively to efforts by all market participants to understand and adapt to the rapidly changing and highly uncertain world of electronic commerce.

What is Electronic Banking?

I will begin by defining what I mean by electronic banking. I would propose that we use a very broad definition. Thus, at one end I would include telephone banking, credit cards, ATMs, and direct deposit, all of which are fairly mature and familiar products, but ones that are certainly electronically based. At the other end are stored-value cards and Internet-based stored value products, products that are still mostly in the experimental stage. In the middle are debit cards, a product that has been around for a number of years but which only recently began to achieve wide market penetration, electronic bill payment innovations, and PC banking, a newer product that is becoming increasingly robust and sophisticated. Defining electronic banking in this way has many advantages, not the least of which is that it helps us to see electronic banking as a continuum that has, in fact, been evolving for quite some time. An implication of this perspective is that lessons learned from past electronic developments may help us better understand both today and the future.

Who uses Electronic Banking?

In the Federal Reserve’s 1995 Survey of Consumer Finances of over 4,000 American households, we asked a series of questions regarding households’ use of electronic and other banking products. Our intent was to establish some benchmark data for 1995, which could then be used with data from future surveys to track and analyze electronic banking. So, in an effort to establish some benchmarks, let me summarize briefly what reality appeared to be in 1995. Our survey results indicated that, in 1995, use of an electronic technology to transact at a financial institution was common, but hardly the dominant form of conducting business. Ignoring credit cards, which were widely used, the most commonly used instrument was direct deposit, a relatively old and well-established electronic technology. While an estimated 50 percent of households used direct deposit, this usage rate paled compared to that of the most commonly used means for transacting at a financial institution. The most common means was the “in-person” visit, employed by 87 percent of households.
Indeed, the second most popular technology was the mail, used by an estimated 57 percent of households. Other electronic technologies, including even the telephone and ATMs, were employed by substantially smaller portions of the population. Stored-value cards did not even show up in numbers sufficient to estimate reliably a national percentage, and use of the computer for transacting at a financial institution was scarcely out of its diapers as far as being a technology that households actually used. Still, some form of electronic technology was utilized by almost 70 percent of American households, even excluding credit cards from the calculation. Perhaps the most interesting and important results from our 1995 survey concerned the importance of education, income, financial assets, and age for the use of electronic banking products by US households. Indeed, when trying to describe who used electronic banking, it seemed difficult to overestimate the importance of education, even after controlling for factors such as income and age. An important break point appeared to be achieving at least a college degree, a level of education held by the heads of less than one-third of households. Households with annual incomes below $25,000 were particularly unlikely to use electronics, and households with annual incomes above $50,000 appeared relatively likely to do so. In addition, households with heads under the age of 35 were considerably more likely to use ATMs, debit cards, and the computer. The only use of electronic technology that increased with age was direct deposit, a reflection of the importance of the direct deposit of Social Security payments. On balance, these results suggested that, in 1995, the potential market for relatively new electronic banking products was highly specialized. Since 1995 the world of electronic banking has certainly changed. But how much has it changed? Unfortunately, we do not seem to have a very clear picture of the overall trends. As I am sure each of you is well aware, estimates of the current state-of-the-world, and projections of the future state-of-the-world, are all over the map. Use of debit cards is on the rise, and continued growth is frequently predicted. An increasing number of banks are providing PC banking services and it will be interesting to see whether new product offerings currently being developed will substantially increase market penetration here, particularly as household PCs with modems become more ubiquitous. Experience with stored-value cards has not come close to meeting the expectations that some had several years ago. Most stored-value card pilots seem to suggest that, while the technology works, consumer and business demand is weak. Recent Federal Reserve research on the use of the ACH shows that while ACH volume continues to increase at double-digit rates, a significant untapped market remains. More than half of households surveyed earlier this year receive one or more payments via direct deposit and about one-third of all businesses surveyed offer this payment option (with almost three-quarters of medium and large companies offering it). More than one-third of households surveyed use ACH direct payment and only 13 percent of businesses offer this payment option (although almost three-quarters of utility companies offer it). Users of both direct deposit and direct debit report very high satisfaction levels; nonusers cite more difficult problem resolution and lack of availability.
In addition, nonusers of direct payment expressed concern that this option would diminish their control over payments, as well as reduce privacy and security. It is interesting to note that when one looks at the incidence of consumer complaints received by the Federal Reserve Board, electronic banking does not stand out as a troublesome area. For example, through July of this year, only two percent of the total complaints involved electronic funds transfer transactions. For the entire year of 1997, only three and one-half percent of all complaints were so directed.

What Might the Future Hold?

So, what, on balance, should we expect in the future? It is useful to consider this question in the context of both the typical product life cycle as well as in relation to the several key factors for the success of new payment services. Some retail payment services, such as stored value products, are at the beginning of the product life cycle, and it is not clear which ones will succeed and eventually have mass acceptance and which will fall by the wayside. PC bill payment is somewhat farther along the life cycle curve, but still has relatively low market penetration. The most interesting group of technologies, in my judgment, are those which have recently moved out of their limited niche markets into mass adoption. A short list would include the Internet, which is a whole new distribution system in itself; debit cards, which use existing credit card and regional ATM networks; and on-line investing, which relies on the Internet. The Internet really only blossomed into mass adoption in the mid to late 1990s with the advent of numerous simple tools for “surfing the web.” On-line investing has just begun to push into early mass adoption. Some apparent reasons for its success include our long economic boom, a graying population looking towards retirement, significantly lower transaction costs, and the provision of associated investor information and services. Finally, heavily marketed off-line debit cards that provide consumers with global merchant access and the convenience of a credit card have fueled debit’s sharp volume gains. As our research indicates, one electronic banking service that has moved well into mass adoption is direct deposit. Users value the convenience, reliability, and security of this means of payment. Credit cards have moved even further along in the product life cycle. This brings me to two key factors underlying the successful adoption of new payment methods. The first key factor, as we have seen with direct deposit, ATMs, debit cards, and on-line investing, is that reaching a critical customer mass is highly dependent on a comprehensive infrastructure. There must be a distribution system that saturates the target market and has resolved most “soft” infrastructure issues - standards, protocols, and a legal framework - before widespread consumer availability will occur. Debit cards and on-line investing have taken this a step further to reduce the time from product introduction to early mass adoption by piggybacking upon existing networks. These infrastructure issues are critical because having electronic banking products that customers can use is the obvious first step in having products that customers will use. Similarly, resolving some of these “soft” infrastructure questions alters the balance in the risk-reward calculation that providers must make in deciding whether to pursue new product introductions.
Second, for customer preferences to shift to a new payment mechanism, customers must perceive that, on net, the new mechanism is more advantageous to them than existing alternatives. Is the new product more convenient to use? More secure? Less expensive? Does it facilitate better recordkeeping? Before customers try a new payment product, they may not be in a position to compare fully how it stacks up with respect to these attributes against the payment method they currently use. Unfortunately, in many cases they cannot easily compare the relative cost of the new payment method with an existing method. In this regard, I find it somewhat puzzling that some banks that advocate the migration of retail payments from paper to electronics nonetheless have pricing structures that appear to run counter to that objective.

Competitive and Market Issues Raised by Payments System Developments

The evolution of the retail payments system raises several interesting issues related to competition. First is the tension between cooperation and competition among providers. Cooperation may help ensure interoperability across products of competing service providers and increase providers’ willingness to accept new technologies and payment system innovations, which can reduce the risk and cost for each participant. Competition without coordination among providers may provide the market with a broader choice of payment innovations, but may delay user acceptance and the associated economies of scale and lower prices that may result from volume growth. A related issue is how the evolution of retail payments and the increasing reliance on electronics will affect the competitive balance between small and large financial institutions. To date, small institutions have proved very adept at maintaining their competitive viability. The effects of the widespread use of some of the more recent innovations in electronic banking, however, are unclear. For example, PC banking may free smaller institutions from a heavy dependence on “bricks and mortar,” and thus greatly expand their geographic reach. Alternatively, it may be that large institutions will have an advantage in being able to support the technical overhead and manage the security concerns required to offer a wide range of banking services on the Internet. Electronic banking may also have implications for Federal Reserve policy in several arenas outside of the payments system. While it is too early to say with any certainty what the implications of electronic banking will be, and my comments must therefore be taken as highly speculative, I would like to give you a feel for my thinking by suggesting two areas that I believe are appropriate subjects for strategic thinking. Electronic banking may prove to be important in the analysis of the competitive implications of bank mergers and acquisitions. A cornerstone of our current approach for enforcing the antitrust laws is the definition of a local geographic market for the cluster of bank products and services - primarily insured accounts and loans and transaction services to small businesses. On-line banking conducted via a household’s or a small business’ personal computer has obvious possible implications for the reasonableness of using a local geographic market. While current usage of PC banking is too small to affect our methodology significantly, in my judgment this is clearly an area that we need to monitor and study, and, if necessary, be prepared to adapt our procedures accordingly.
In this regard, I would note that good data on the use of PC banking by households and small businesses is a must. Electronic banking raises similar issues for our evaluation of the Community Reinvestment Act performance of banks. Here again, a fundamental aspect of our current approach is definition of the relevant geographic area. These are complex topics and the jury is still clearly out, but, in my view, they are areas that deserve careful watching.

Federal Reserve Payments System Role

Clearly, an efficient and smoothly functioning payments system is critical to the health and stability of the financial markets and the economy more broadly. Although there is surely room for continuing improvement, the U.S. payments system fares quite well when compared to other countries. As the central bank, the Federal Reserve has a keen interest in fostering continuing improvements in the efficiency and integrity of the U.S. payments system. We work to achieve this objective through several important payments system roles, those of regulator, service provider, and facilitator. As regulator, we are always open to your suggestions for revisiting regulations that may pose impediments to payments system innovations. In considering changes to existing regulations, we would assess how the change might affect the public good. For example, is the original public policy rationale for the regulation no longer fully valid? Would the proposed change make society better off? As service provider, the Federal Reserve is working closely with the industry to enhance the ACH marketing and education that is needed to substantially increase market penetration. The market research that I discussed earlier is important in helping to focus these efforts. In addition, the Reserve Banks are providing ACH EDI transaction capability to all customers to facilitate the use of the ACH for vendor payments. In the check service, the Reserve Banks are presenting an increasing number of checks electronically (ECP currently represents about 14 percent of Fed check collection volume), are introducing a wider array of check image services, and are developing pilot programs to learn more about ECP. To facilitate the settlement of transactions that clear outside the Federal Reserve, the Reserve Banks will soon be introducing an enhanced net settlement service, which provides better finality, security, and risk controls than the net settlement service used by most private clearing arrangements today. Finally, as facilitator, we are eager to hear about your efforts, both the successes and the failures, and to learn what consumers think. If we are to make informed decisions on policy and regulatory issues in this rapidly changing environment, we must have a sound understanding of market developments and the implications of policy choices. I do not believe the Federal Reserve should dictate or promote any specific “vision” of the future payments system, particularly with respect to the small-dollar retail payment mechanisms. The marketplace will be the ultimate judge of the future direction of the payments system. We should understand, however, the payment mechanism attributes customers are demanding, and the products and services you are developing to meet that demand, so that we can shape policies in the context of market realities.

Conclusion

In closing, I hope that my discussion has provided some additional insight into the consumer side of electronic banking and its implications for Federal Reserve policy development.
I would like to again extend my personal welcome for your suggestions on how to improve the climate for payments system innovation, and to invite you to share with us your experiences with electronic banking. The insights that we obtain from an ongoing dialogue with industry representatives need to be augmented with further research in this area. In this regard, the Federal Reserve plans to undertake a major research effort in 1999 that will include surveys of consumers, businesses, and depository institutions about the use of retail payments. The surveys will gather data on the use of various retail payment instruments and the use of technology to facilitate retail payments, and will also explore public perceptions about retail payment instruments. We look forward to sharing the results of these surveys with you. They should provide both the Federal Reserve and the private sector with a much broader base of information from which to evaluate policies and initiatives.
Ms. Rivlin discusses the Federal Reserve’s role in the payments system

Remarks by the Vice-Chair of the US Federal Reserve System, Ms. Alice M. Rivlin, at the Bank Administration Institute’s Symposium on Payments System Strategy in Washington, D.C. on 28/9/98.

The Federal Reserve’s Roles in the Payments System

This is an exciting time to have a conference on payment system strategies, because there are real opportunities for rapid change over the next few years in the way we make payments, especially retail payments in the United States. You will be discussing some of those opportunities and possibilities over the next few days with experts on the forefront of change. You will hear presentations that describe new worlds in which payments are made more efficiently, more conveniently, with greater security and less fraud than they are now - not just by giant corporations and sophisticated techies, but by ordinary folks and average enterprises going about their daily activities. But while there are opportunities and possibilities, there are also enormous uncertainties, both about the direction that technological change will take and about the willingness and ability of individuals and institutions to embrace these changes. It is certainly possible that a decade from now we will have a radically different retail payments system in the United States than we have now. It could be a system that depends very little on paper or on land and air transportation of paper messages. It could instead be one in which both consumers and businesses, not to mention governments, make almost all of their payments by electronic means from wherever they happen to be. On the other hand, it is also possible that not much will happen to retail payments over the next decade or at least that change will not accelerate from its current pace. Some of the people in this room could be back here in ten years assessing why the promising innovations of the late 1990’s never came into general use, and why, although credit, debit and automated clearinghouse (ACH) payments continued to expand rapidly, billions of paper checks were still being written and physically transported for presentment. In part - although only in part - the outcome depends on us, on the people in this room, and the enterprises and institutions we represent. It depends on how effectively and imaginatively we all do our individual jobs and on whether we find constructive ways to work together. I happen to be one of those who thinks change in retail payments could accelerate quite rapidly in the next few years and that paper check volume could begin to decline steeply in the relatively near term. I say could rather than will because I realize that prediction has been made many times before and has always proved wrong. I guess like so many others I am just struck by the absurdity of moving so many little pieces of paper around the country when the messages could be sent so much more efficiently by electronic means. Every time I visit a Reserve Bank and watch those high speed sorters working so hard and the trucks coming in and out to the airplanes, I think, “This is a really impressive, fast and cost-effective way of moving paper, but why are we moving paper?” The “we,” of course, is not just the Federal Reserve, but all those banks and clearing houses and other participants in the system. And the answer to why we are all moving all this paper is that customers want us to move it and are willing to pay to have it done.
But the answer still doesn’t make a whole lot of sense. There are several interrelated reasons why change could be expected to accelerate in the retail payments arena in the near future. • First, of course, is the rapidity with which technology is evolving and costs are dropping in computing and telecommunications. Many types of information, including payments information, can be transmitted much faster and cheaper than was possible even a year or two ago, and there seems to be no end to the process in sight. But the avalanche of technological progress may actually be holding up a great leap forward in the efficiency of retail payments. Because the network effect is important and the players who have to commit to a new technology to make it viable are quite numerous, change may come slowly at first. Potential players may be confused by the multiplicity of options and reluctant to commit if they think it likely that something faster, cheaper and more convenient may come along soon. It’s like hanging on to your old television for a while in hopes that the next generation of models will give you a lot more definition for less money. • A second reason is that the restructuring of the whole financial services industry across geographic and product lines is providing a huge opportunity for major institutions to rethink how they transmit information and make and receive payments. This restructuring, combined with the onrush of technology and the heavy emphasis across all business sectors on competitiveness, cost reduction and efficiency, could accelerate the substitution of electronic for paper based payments at the consumer as well as business level. But in the short-run, restructuring, like rapid technological change, may actually slow things down just because information technology people and their top bosses simply have too much to think about at once. Coping with mergers, the year 2000, and the transition to the Euro -- not to mention wild swings in the financial markets -- may not allow radical changes in payments technology to get much attention in the next year or two. • A third reason to expect acceleration of change in retail payments is the explosive growth of the Internet and its increasing use for shopping and business transactions. The number of consumers and small businesses using the Internet routinely for transactions that require payments could skyrocket if several different types of industries move in this direction at once. ◊ It could quickly become “normal,” not only to shop for mortgage rates on the Internet, but to apply for mortgage loans and get approved over the Internet as well. ◊ It could also become normal for potential college students of all ages to shop for colleges or course offerings over the Internet and to apply and be accepted electronically as well. ◊ It is possible that paying taxes, trading securities and making some kinds of purchases on the Internet could become routine activities of a large fraction of the population quite quickly (the way making phone calls and driving a car became routine quite quickly for earlier generations). If this happens, the demand for electronic retail payments is likely to rise rapidly as well as to spill over into increased demand for (or reduced resistance to) electronic payments methods for non-Internet transactions as well. 
• The key unknown here is how fast average people become comfortable with PCs, lose their anxiety about electronic transmission and storage of data and overcome the feeling that if something isn’t written on paper and saved in a drawer it isn’t real. • The usual assumption is that computer literacy is age- and income-related and that it will take a long time before older and less affluent (or at least less educated) people get comfortable with electronic payments and paperless transactions. But we could be in for some surprises as governments move quickly to electronic benefit payments, which mainly affect older and less affluent groups. Aggressive efforts to improve computer instruction in low income schools, compensate for less rich educational environments with creative use of the Internet, and prepare welfare families for modern clerical and data processing jobs could undermine some widespread assumptions pretty quickly, as could aggressive marketing of senior citizen computer access and services. The most rapidly growing demographic group currently on the Internet is retirees. For all of these reasons, I would be willing to bet that, despite the relatively slow pace of change in retail payments over the past several decades and despite Americans’ long-standing affection for the convenience, flexibility and familiarity of the paper check, we are likely to get to some sort of tipping point early in the next century (less than a decade from now) after which electronic retail payments quite quickly become the norm and the volume of checks falls rapidly. This does not mean that I see retail payments moving to some single electronic payment instrument that would substitute for the check in all or most of its uses. Rather, I suspect we will see the evolution of several different systems, adapted to different kinds of payments that consumers and businesses make. Some of these systems are in use already, some will be discussed today, others are not yet ready for prime time. Some will be developed by the banking system, some by non-banks in cooperation, or even in competition, with depository institutions. So what will be the role of the Federal Reserve in all this? As you know, I chair a Fed committee which is explicitly focussed on this question. We issued a report last January and are continuing to discuss how the Fed can best contribute to the efficiency and safety of a retail payments system to which all depository institutions have access. The Committee now consists of Presidents McDonough and Minehan, Governors Kelley and Ferguson and myself. You will hear from Governor Ferguson tomorrow afternoon. We see our role as working closely with the other participants in the payments process to help modernize the retail payments system and facilitate the transition over time to an efficient, convenient, and primarily electronic system. We spent quite a lot of time and energy last year discussing whether we should exit the business of processing check and ACH payments. We decided to stay in and use our operational presence to help modernize both check and ACH and to work actively with others to evolve a vision of a more efficient future payments system and help remove obstacles to its realization. In practice, what does that mean? First, it means increased efforts by the Federal Reserve Banks - as providers of check and ACH services - to develop new products that reduce costs and to price and market them effectively.
In the first instance, that means focussing on ECP with truncation and imaging, in order to meet market demand and gain experience with different technologies. We hope to learn a great deal more about whether the wider use of these products can reduce the cost of check collection by substituting electronic technologies for the transportation and processing of paper. We are involved in several pilot studies of ECP, including one which we hope will include all depository institutions in the whole state of Montana. Montana has small banks widely dispersed over great distances. It seems like the right place to see whether, if virtually all the institutions participate, it would be cheaper and more satisfactory to send images electronically rather than to move paper over thousands of miles. Like others who have been studying the business case for ECP and imaging, we believe the jury is still out. It is not clear that moving the whole national check processing system to ECP (with or without imaging) would provide net societal benefits even if the willingness and resources to do it could be found. So you will see us working hard, in collaboration with others in the industry, to explore, in practical ways, the potential advantages and disadvantages of ECP. You will also see us working hard, in collaboration with NACHA and others, to improve ACH service, make it more user friendly and increase its use. You will see us trying, through surveys, focus groups and other tools, to learn more about how consumers and businesses use different types of retail payments and how they think about them. As we have worked on positioning the Fed to help move the nation to the payments system of the future, we realized that many of our assumptions about the composition and use of retail payments, paper checks in particular, are based on information collected by survey in 1979 and were likely seriously out of date. We and the industry need more current information on which to base strategies going forward. We will also be reaching out to participants in the payments system in a variety of groupings to talk about alternative visions of the payments system of the future, what legal, regulatory and technological barriers there might be to getting there and what we might be able to do to reduce those barriers. We certainly do not see the Fed managing the process of transition to a more fully electronic payments system. We believe strongly that innovations must come primarily from the private sector and be tested in the marketplace to see if they meet customer needs at competitive prices. But there are times when progress can be stymied by legal or regulatory barriers or by the absence of technological standards, especially standards that allow communication among competing systems. In these instances, dialogue and collaborative effort can pay off for the whole system and its customers. The Fed wants to be an active participant in that dialogue along with other providers and users of the retail payment system as we work together to make it better.
Mr. Ferguson remarks on the challenge of preparing for the Year 2000

Remarks by Mr. Roger W. Ferguson, Jr., a member of the Board of Governors of the US Federal Reserve System, at the Global Year 2000 Summit in London on 16/10/98.

It is a pleasure to speak with you today in my role as Chairman of the Joint Year 2000 Council. These summits are highly valuable as a forum for sharing information and fostering a shared sense of purpose. Today, I would like to reinforce the importance of the Year 2000 challenge, describe some of the work of the Joint Year 2000 Council, and suggest some actions that you might wish to undertake in order to help your firms, your countries, and the world to prepare for the century date change.

Year 2000 Challenge in the Context of Current Financial Turmoil

In the past weeks we have heard much about the current challenges facing the global financial system. National economies and international capital markets are facing turmoil, with liquidity and risk being repriced rapidly. Recession is impacting significant numbers of nations, and even the world’s strongest economies are facing unusual threats to continued healthy growth. These uncertain economic times create a difficult backdrop for keeping the appropriate level of focus on long-term challenges, such as the Year 2000 problem. Without in any way diminishing the importance and urgency of the immediate financial challenges, I do want to emphasize the importance of giving continued attention to the challenge of completing preparations for the Year 2000. This may seem like a somewhat arcane topic to discuss as world financial markets struggle to right themselves. However, the current global financial upset has reminded us that the world is truly interdependent and that counterparty and country risks can be, and are, re-evaluated quickly. An exchange rate problem that started in a small economy in Southeast Asia has, within a little more than one year, created a contagion that may potentially impact the growth performance, as well as financial performance, in the most highly developed industrial economies. Similarly, Year 2000 problems, if not addressed, remedied and planned for, can have unexpected spillover effects. Because of this interconnectedness, it is most urgent that we complete our Year 2000 preparations and provide full information to the public so that markets are not surprised and can operate as smoothly as possible during the crossover to the new century. It is natural to turn attention to the immediate and very real financial problems; but those in this room, and many others, can provide an important service by remaining attentive to the important task that we face in preparing for the century date change.

Interaction between Year 2000 and the Euro

The current economic backdrop is not the only challenge facing those of us with a focus on the Year 2000 challenge. Most countries represented here today are also making a significant investment in preparing for the introduction of the euro. While the introduction of the euro is an event to be welcomed by market participants, it does raise several interesting challenges for Year 2000 preparations. First, of course, is the contention for scarce IT resources and the issue of whether there are “economies of scope” in these two projects or at least a benefit in “learning by doing.” The overlap between the two problems and two technical solutions is not as great as many would like. At the narrowest level, they involve different approaches to the technical solution.
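For readers who have not seen the defect these remediation projects address, a minimal illustration may help; the code below is hypothetical and not drawn from the speech. Legacy systems that stored years as two digits sort and subtract dates incorrectly once "00" follows "99":

    # Classic two-digit-year failure: date arithmetic breaks at the rollover.
    def years_elapsed(start_yy, end_yy):
        # Buggy legacy-style logic: assumes two-digit years always increase.
        return end_yy - start_yy

    print(years_elapsed(98, 99))  # 1, as intended
    print(years_elapsed(99, 0))   # -99: a contract "ages" backwards in 2000

    # One common remediation is windowing: pivot two-digit years into a
    # four-digit range before doing any arithmetic.
    def to_four_digit(yy, pivot=50):
        return 1900 + yy if yy >= pivot else 2000 + yy

    print(to_four_digit(0) - to_four_digit(99))  # 1: correct after the fix

The embedded-systems point in the speech is the same defect compiled into chips, where the logic cannot simply be re-coded and redeployed.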
More generally, the testing approaches are different, with the euro relying on operational testing and Year 2000 being more of a technical test. Finally, Year 2000 involves embedded systems while the euro will not affect systems with embedded chips in most cases. There may also be contention for another limited resource, which is top management attention. Year 2000 was initially considered a technical problem, which many managers may have thought could be best handled by technicians. The euro conversion is obviously a business issue, which entails strategy and is more accessible to the well-informed senior manager. More importantly, the euro may be seen by some as having revenue potential that comes from offering new services ahead of competitors. Year 2000 may offer some revenue opportunities, but it is probably most correctly perceived as a cost concern for many senior managers. Both conversions must be managed as business issues because of the significant risks they both pose to operations. Some commentators may have created another, false, relationship between these two technical challenges by suggesting that the Year 2000 problem is a creation of some who want to distract attention and resources from the euro. I know of no responsible person who would create the need to spend the large sums necessary to prepare for the century date change or risk other disruptions for the purpose of diverting attention from a legitimate and beneficial effort of creating a common currency for a major economic region. In addition, computer professionals have known of this problem for years and are, I believe, universal in recognizing its importance. However, the story is more complex than these elements of resource contention; there are some benefits from having these two major systems changes in close proximity. First, we can all learn about the importance of cross-firm coordination in tackling these systems issues in the new era. In addition, there will be learning on how best to perform contingency planning for these cross-firm, cross-border events. Finally, the euro conversion experience may be seen as a leading indicator for the Year 2000 weekend that will follow a year later. If things proceed well, many may interpret that as lessening the pressure, and efforts to prepare for Year 2000 might appear to have a lower priority. If euro conversion is not as seamless as expected, the perceptions of Year 2000 risk will likely increase, possibly leading to excessive concern and overreaction. It will be incumbent upon knowledgeable and balanced commentators to assess the situation coolly and point out appropriate similarities and differences between the two events.

Disclosure and Transparency

One way in which we can all ensure that balanced and rational discussion prevails is by adequate disclosure and transparency. The lack of adequate information on Year 2000 preparations by firms and countries is a significant concern because, as we have seen recently, lack of information may feed negative perceptions in the marketplace. The interdependence of financial institutions, markets and payments systems, and the overall dependence by firms on the readiness of the public infrastructure, make it very important for firms and countries to share information on the status of plans and preparations. The Year 2000 problem could potentially lead to a cross-border reassessment of counterparty and country risks similar to that occurring now.
Unless we continue to make steady progress in preparing for the century change and communicate clearly how we are doing, we may experience what I call the “Year 2000 shadow” in the financial markets. Financial market participants, and entire countries, may find that capital will become scarce, or at least dear, if they are not seen to be making sufficient progress toward resolving this problem. The uncertainty surrounding preparedness for Year 2000 may make markets less liquid as institutions seek to insulate themselves from risk with counterparties who are thought to be unprepared. The financial cost of this is not clear, and I am not one of those forecasting recession as a result of a Year 2000 slowing, but we know from recent events that a flight to liquidity can have severe repercussions in the real economy. In the United States, legislation has recently been passed by Congress to protect firms that share information in good faith on the status of their Year 2000 projects. In July, the US Securities and Exchange Commission issued new guidance for publicly traded companies requiring them to disclose information regarding the scope of their Year 2000 problem and the status of their Year 2000 readiness. These actions are intended to prompt firms to be more forthcoming in providing information so that firms and investors may respond appropriately. While disclosure rules like those mandated by the SEC are a positive development, the Year 2000 process itself is constantly evolving, so formal disclosures will almost certainly be months behind the marketplace in assessing the status of firms. Voluntary disclosure, therefore, must form an integral part of plans. Firms that are forthcoming and disclose their status in forums like the Global 2000 meetings set an example for other firms in their countries and provide the marketplace with vital information. In the United States, the oil and gas industry associations recently surveyed their members on the status of company readiness. While the survey revealed that 45 percent of the respondents are still in the assessment phase, there has been no negative reaction in the marketplace or in the public press to the information. One probable reason for this reaction is that the survey and public disclosure conveys a message that companies have recognized the risks and have implemented programs and resources necessary to address the Year 2000 problem; information is reassuring.

The Joint Year 2000 Council

The Joint Year 2000 Council is providing a complement to these private sector efforts. We do not represent the global “official sector”, but we are attempting to coordinate and share information among the world’s financial regulators. To that end, we have issued a number of policy papers to provide regulators with a strong sense of better practice. The Council also provides information through a bulletin and a website. We also plan to sponsor a series of regional meetings before the end of the year to bring together regulators at a regional level to discuss progress and to provide solutions to challenges. Finally, we have held conversations with senior members of the private sector who are members of our External Consultative Committee to impress upon them the important role that they play in making these preparations. Early in its life, the Joint Year 2000 Council issued a statement emphasizing the importance of each country’s developing a national Year 2000 strategy, including having senior government coordination.
In the United States, we have created a President’s Council, which is one example of government coordination of Year 2000 preparations. This has been a particularly useful tool for allowing further information sharing and disclosure, as well as speeding preparations. In spite of these efforts and examples of good practice, some countries will probably not be as prepared as we would hope. Time will run out for a few; however, that should not stop all countries and firms from continuing their best efforts. It is never too late to start the remediation process, focused in the first instance on the most critical systems, and working diligently to seek the highest level of preparation possible. The risks of some disruption to international trade and financial markets are certainly not negligible, but these risks can best be handled by continued hard work and by planning for current and likely contingencies.

Plan of Action

As we move forward, I would like to suggest some steps we can all take. First, you and your colleagues are critical in continuing to build awareness within your financial and business communities and within your countries, about the importance of focusing on the Year 2000 so this event does not become the next crisis impacting financial markets. Second, some governments do not yet fully appreciate the magnitude of the Year 2000 problem. The Joint Year 2000 Council has an active program to raise awareness among financial regulators worldwide and to share information on important Year 2000 tasks. You can assist these efforts by raising the topic with political and regulatory leaders in countries in which your firms are active and providing your market-based sense of the risks. Third, I believe that accurate and timely information is the best antidote to the misinformation that is likely to spread as we get closer to the critical dates. I would, therefore, strongly encourage efforts like those currently under way by Global 2000 to obtain voluntary disclosure of information about the status and readiness of firms. I expect that market participants and the public at large will start to expect more information, and those that do not disclose face the risk of being considered unprepared for the Year 2000 transitions, raising business viability and even systemic concerns. Fourth, the “dependencies” in the public infrastructure that we all have, namely telecommunications, power, and water, need further attention. This is a vital link that is becoming increasingly obvious. I would encourage you to have conversations with your utilities and telecommunications providers, and with appropriate regulatory authorities, about preparedness of the industries that are vital in operating the public infrastructure. Utilities have been particularly cautious in explaining their preparedness. More disclosure from them would be useful. Finally, an important and responsible part of any project includes contingency planning. Contingency planning should be recognized as a legitimate exercise, not an admission that you expect failure in the core effort of preparing for Year 2000.

Conclusion

In closing, let me say that we have achieved much in the past few months. The level of awareness and appreciation for the seriousness of the risks posed by the Year 2000 is growing every day. For example, despite many competing issues and concerns, the Year 2000 was addressed during the recent meetings of the IMF and World Bank.
The Joint Year 2000 Council is committed to accomplishing its goals of sharing information and helping supervisory authorities perform their duties. We also remain committed to work with our colleagues in Global 2000 as they encourage the private sector to take responsibility for their preparations. We must also continue to work hard to encourage governments and companies that provide public infrastructure services to prepare for the Year 2000. Finally, we should all encourage disclosure and sharing of information by firms and governments. I am confident that the current business and technical headlines, whether they regard international financial contagion or euro conversion, will continue to receive the appropriate level of private-sector and official attention. However, the clock will march inexorably forward, and within the next fourteen months we will all know how prepared we collectively are for the century date change. With concerted effort, I hope that we will not be found wanting.
Mr. Meyer reports on increasing global financial integrity: the roles of market discipline, regulation and supervision

Speech by Mr. Laurence H. Meyer, a member of the Board of Governors of the US Federal Reserve System, at the 16th Annual Monetary Conference, Cato Institute, Washington, DC on 22/10/98.

Recent developments, both at home and abroad, have reinforced the importance of efforts to increase the stability of global banking and financial markets. The central challenge of promoting the stability of the global banking system is sometimes viewed as bringing banking systems around the world up to best practice standards, as exemplified in the supervisory and regulatory frameworks of advanced industrial economies. While this is, to be sure, a priority - at least for internationally active banks around the world - let me offer a more comprehensive framework for thinking about enhancing the stability of the global banking system. First, best practices is a dynamic, not a static, concept. The rapid change in banking markets and practices, and more generally in financial markets and instruments, challenges supervisors and regulators around the world to adapt in order to maintain the effectiveness of regulatory standards and supervisory practices. Best practices simply must get better over time or they will soon not be good enough. Second, best practices may not be the same for all banking organizations. It is time to think about not only enhancing the contributions of market discipline, reforming capital standards and enhancing supervisory practice at the largest and most complex banking organizations, but also about differentiating how we regulate and supervise these institutions relative to small and medium-sized banks. Third, we have to continue to work on appropriate global norms and make greater progress in achieving greater convergence to best practices - including, perhaps, different best practices for different classes of institutions - around the world. This means appropriate investments in developing the standards, in technical assistance, and in monitoring compliance. I am going to apply this broader perspective on global banking to three pillars of an efficient and effective supervisory and regulatory framework for banking: market discipline, regulation, and supervision. First, I will discuss the potential for enhanced disclosure and improved incentives to increase the effectiveness of market discipline in controlling risk-taking by banks. Second, I will discuss the importance of reforming our international capital standards as an example of dynamic best practices. Third, I want to emphasize the important role for bank supervision.

I. Market Discipline: Information and Incentives

There appears to be a renaissance in appreciation of the contribution of market discipline in banking. This undoubtedly reflects a growing awareness that recent developments in banking and financial markets have made regulatory standards less effective and supervision more challenging, that it may be simply impossible to adapt regulatory standards and supervisory practices fast enough to keep pace with market developments. Of course, market discipline does not come easy to banking. We start by establishing a safety net for banks, including deposit insurance, access to a lender of last resort, and payment system guarantees. This immediately reduces the incentives of market participants to monitor the risk-taking of banks and, in turn, encourages excessive risk-taking by banks.
We then put in place a regulatory and supervisory framework to counter these moral hazard incentives. And, when we are done, we complain about the absence of market discipline. I appreciate that this sounds just a bit contradictory. I could be more consistent and perhaps more appreciated at this institute if I urged the dismantling of the federal safety net and the full return to reliance on market discipline. But it is clear that society has made one of those irreversible choices and will not choose to eliminate deposit insurance, the discount window, a Federal Reserve payments system, or bank supervision and regulation. Moreover, my reading of history suggests that, despite the strong indications of the success of market discipline, the costs of the associated financial market disturbances to real economic activity are too high for market discipline to be the sole regulator of banking. I understand opinions differ on this, just as they must when supervisors and regulators make judgments to act at specific times. But, what is worth understanding is that there is now general agreement that moral hazard is a significant problem, that markets are increasingly complex - making it more difficult for supervisors and regulators - and that supervision and regulation have significant costs and inefficiencies. As a result, we must begin to increase our reliance on market discipline both as a governor and as an indicator. When I talk about market discipline, some hear a call for a substitution of more market discipline for less regulation and supervision. Listen more closely. My concern is that the task of regulation and supervision has become more challenging, and we therefore have to increase the effectiveness of our overall regulatory and supervisory framework to keep pace. I am referring here to the increased size, breadth and geographic scope of banking and to its increased globalization and complexity. I want to achieve this increased effectiveness by enhancing market discipline, modifying our regulatory standards, and adapting supervisory practice. We also want an efficient regulatory framework, one that achieves its objectives at the lowest cost. Where there are opportunities to achieve the same degree of stability through increased reliance on markets and reduced emphasis on regulation and supervision, we should, of course, do so. But right now, my focus is on increasing the overall effectiveness of the framework and on relying as much as possible on market discipline and improved capital standards to get this job done. It is one thing to extol the virtue of markets and another thing to find ways to make them work more effectively in controlling risk-taking in banking, at least in the presence of the safety net. The time has come to move beyond rhetoric to specific actions. Within the Federal Reserve System we have a variety of working groups and committees working to this end. We also have the benefit of ongoing work and recent reports by the Basle Committee on Banking Supervision and the Group of 22. I expect that these efforts will result in concrete proposals soon. Market discipline is about information and incentives. Market participants must have access to reliable and timely information on the financial conditions and prospects of banking firms, and both the market participants and the firms must have incentives to respond to this information. With respect to information, we need to consider what additional disclosure would contribute to enhanced market discipline. 
Providing incentives is important because such incentives can encourage banks to provide voluntarily the information the market demands. I have not come with a specific set of recommendations for improved disclosure. But two common principles apply here as well as to issues related to regulatory reform and adaptation of supervisory practice. Banking systems in emerging economies have a long way to go to catch up to global norms with respect to disclosure. Banks that want to play in global markets should meet international standards with respect to transparency and disclosure. Banking systems in developed economies, in turn, should not be resting on their laurels when it comes to transparency and disclosure. Here again “best practices” has to be understood as a dynamic process. The increased complexity of banking, the speed with which risk positions can change, and the increased size and breadth of banking organizations all seem to point to the usefulness of improving disclosure to provide the raw material the markets need to continuously reevaluate changing risk profiles. Improved disclosures about risk management practices, risk profiles, and risk management performance - coupled with disclosures about regulatory capital and its components, risk-weighted assets and related matters - could provide transparency that will facilitate market discipline. Timely disclosure of this information would enable market participants and supervisors to better assess how institutions maintain sound capital in relation to their changing risk exposures. Some of these types of disclosures are among those set forth in the recent Basle Committee paper on Enhancing Bank Transparency. International disclosure standards should include, for example, guidelines for reporting non-performing loans, for provisioning for non-performing loans, and for classifying the quality of loans. Upgrading international banking statistics to provide more detailed data on international exposures would enhance the ability of both supervisors and market participants to assess the vulnerability of domestic banking systems to financial shocks from abroad. It would also be useful to provide better information about the complex activities of the largest, internationally active banks, such as securitizations, credit risk modeling and credit derivatives. The second foundation of market discipline is incentives. One promising direction might be to require at least the largest and most complex banks to issue minimum amounts of subordinated debt as a way of enhancing market discipline. To a degree I view this as a metaphor for specific steps to enhance market discipline, to emphasize that we have to move beyond rhetorical praise for markets to specific steps to harness the role of markets in monitoring risk-taking by banks. There may be other approaches that can accomplish this task, as good as or better than subordinated debt. But, right now, I do believe there is considerable promise in a subordinated debt requirement. An appealing aspect of this approach is that subordinated debt holders, so long as they are not bank “insiders,” face only downside risk, and thus their risk preferences are very close to those of the FDIC. Subordinated debt holders would therefore be expected to impose market discipline on the bank that is quite consistent with what bank supervisors are trying to do, including encouraging banks to disclose more information about their financial conditions. 
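To make concrete how supervisors might monitor such market signals, here is a minimal sketch in Python of tracking subordinated-debt spreads over Treasuries as an early-warning flag. The banks, yields, benchmark yield, and the 50-basis-point alert threshold are all hypothetical illustrations, not an actual Federal Reserve or FDIC methodology.

# Minimal sketch: subordinated-debt spreads as an early-warning indicator.
# All banks, yields, and the alert threshold below are hypothetical.

TREASURY_YIELD = 0.0525          # assumed benchmark risk-free yield (5.25%)
ALERT_WIDENING_BP = 50           # flag spreads that widen by 50+ basis points

# (bank, sub-debt yield last quarter, sub-debt yield this quarter)
banks = [
    ("Bank A", 0.0610, 0.0625),
    ("Bank B", 0.0650, 0.0745),  # spread widening sharply
    ("Bank C", 0.0590, 0.0595),
]

for name, y_prev, y_now in banks:
    spread_prev_bp = (y_prev - TREASURY_YIELD) * 10_000
    spread_now_bp = (y_now - TREASURY_YIELD) * 10_000
    widening = spread_now_bp - spread_prev_bp
    flag = "REVIEW" if widening >= ALERT_WIDENING_BP else "ok"
    print(f"{name}: spread {spread_now_bp:.0f} bp "
          f"({widening:+.0f} bp vs last quarter) -> {flag}")

In practice, the usefulness of such a flag would depend on how deep and liquid the market for bank subordinated debt turned out to be - a caveat the speech itself raises below.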
Observed risk premiums on subordinated debt could perhaps be used to help the FDIC set more accurate risk-based deposit insurance premiums, and such debt would provide an extra cushion of protection to taxpayers. An additional benefit of having subordinated debt traded on the open market, at least if the market for subordinated debt was sufficiently liquid, is that price movements would provide a clear signal of the market evaluation of the bank's financial condition that would serve as an early warning signal to aid supervisors. Such an approach would most likely be limited to the largest banks for at least two reasons. First, it is the increased size of banking organizations and the financial innovations of the largest and most sophisticated banks that are challenging the effectiveness of the current regulatory and supervisory framework. Second, it is unclear just how deep and liquid a market for bank subordinated debt would be and what access small and medium-sized banks would have to this market. This suggests that an operationally feasible program of mandatory subordinated debt would require a considerable amount of further thought. II. Reforming Capital Standards Market discipline, as I noted, is a complement to the regulation and supervision required by the safety net. The foundation for bank regulation has long been capital standards. Since 1988, these have been international standards as embodied in the Basle Accord. Capital standards can be thought of as a way of reinforcing market discipline, because they shift more of the risk to shareholders relative to taxpayers. In this way, capital is the closest relative to market discipline among bank regulations. That is the good news about capital standards. The bad news is that they are becoming increasingly less meaningful and progressively undermined as a result of regulatory capital arbitrage. Regulatory capital arbitrage refers to the gaming of the capital standards, the exploitation of loopholes that allows banking organizations to lower the amount of capital held for a given level of risk. Regulatory capital arbitrage has been encouraged by the limitations of and excessive rigidity in the current capital standards. To be sure, it serves as a kind of safety valve, preventing the capital rules from distorting bank behavior in non-economic ways. But capital arbitrage also undermines the effectiveness of our capital rules and creates some economic distortions. Banks engage in regulatory capital arbitrage mainly through securitizations and credit derivatives. These vehicles offer improved opportunities for banks to manage their risks and liquidity. Securitizations allow banks to transfer the risk of an underlying loan pool to capital markets, retaining their role in the origination of loans, but transferring more of the ultimate funding to the capital markets. Credit derivatives are another vehicle for shifting some of the risk of the underlying loan pool, while avoiding the disclosure of an effective loan sale and the possible harm to a customer relationship. There are two problems with the use of securitization and credit derivatives. First, banks often use these vehicles to reduce capital charges by more than they reduce the risk associated with the underlying loan pool. This is the problem of limited transference of risk. Banks generally retain the risk of "first dollar" or "second dollar" loss on the underlying assets via various credit enhancements. 
Indeed, through devices such as "remote origination" and "indirect credit enhancements" the regulatory capital associated with retained risks can be very low, even zero. As a result, some refinement in the capital treatment of securitizations and credit derivatives is needed to better match the capital with the risk retained by banking organizations. But the second problem is the more difficult one. Securitizations and credit derivatives are basically exercises in cherry picking. That is, they allow banks to transfer the high quality assets out of their banking book. Banks do so for good reason, to be sure. They do so because our current capital standards apply the same risk weight to all non-mortgage loans, and the regulatory capital charges for the highest quality loans are often well in excess of the economic capital that banks internally allocate to such loans. In a world of increased competition, banks have to use every opportunity to manage their risks and capital to best advantage. And that means either not originating such high quality loans or finding ways to make the regulatory capital converge to their estimate of the economic capital of such loans. Regulatory capital arbitrage is a safety valve because it allows banks to stay in the business of making such high quality loans. The main problem here is not the reduction in capital charges against the high quality loans, although this does raise tension with prevailing international capital standards. The most serious problem is the increased risk of the residual loan portfolio, the portfolio that has been effectively stripped of its high quality assets for which regulatory capital exceeds economic capital. The remaining loans, to the extent this process has been carried out to its logical conclusion, all have economic capital equal to or in excess of the regulatory capital charges. The 8 per cent capital charge for the average loan portfolio may, as a result, be quite inadequate for the residual higher-risk loan portfolio. On the other hand, the 8 per cent risk-based capital standard may still be adequate for banks that do not engage in cherry picking via securitizations and credit derivatives. Therefore, as we move toward reform of the international capital standards, we should not necessarily apply the same capital standards to all banks. The point here is not to blame banks for engaging in regulatory capital arbitrage. The fact is that regulatory capital arbitrage is an outgrowth of the considerable resources that banks have devoted to better measuring, pricing and managing risk. And it is also a response to serious shortcomings in our international capital standards. The solution is not to rein in banks, but to catch up to banks - specifically, to reform our international capital standards in a way that takes advantage of some of the advances in credit risk measurement and management in the industry. I am not going to be more specific at this point. Here too the Federal Reserve has been actively engaged in analyzing the emerging tensions between bank practices and regulatory rules and has been working toward recommendations for reform. As you know, regulatory capital standards, at least those that apply to internationally active banks, require international consensus. 
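The arithmetic of the cherry-picking incentive can be made concrete with a small sketch. Only the flat 8 per cent regulatory charge comes from the text; the loan amounts and the per-grade economic-capital rates below are invented for illustration, since actual internal allocations vary by bank.

# Illustrative arithmetic behind regulatory capital arbitrage under a
# flat 8% charge. Economic-capital rates per grade are hypothetical.

REGULATORY_CHARGE = 0.08  # flat Basle Accord charge on non-mortgage loans

# loan grade -> (amount outstanding, bank's internal economic capital rate)
portfolio = {
    "high quality (AAA-like)": (600.0, 0.01),
    "medium quality":          (300.0, 0.06),
    "low quality (junk-like)": (100.0, 0.15),
}

for grade, (amount, econ_rate) in portfolio.items():
    reg_cap = amount * REGULATORY_CHARGE
    econ_cap = amount * econ_rate
    incentive = "securitize" if reg_cap > econ_cap else "retain"
    print(f"{grade}: regulatory {reg_cap:.0f} vs economic {econ_cap:.0f}"
          f" -> incentive to {incentive}")

# Once the high-quality loans are moved off the banking book, the
# residual portfolio is riskier yet still faces the same 8% charge.
residual = {g: v for g, v in portfolio.items() if "high" not in g}
total = sum(amount for amount, _ in residual.values())
econ = sum(amount * rate for amount, rate in residual.values())
print(f"residual book: 8% charge = {total * 0.08:.0f}, "
      f"economic capital = {econ:.0f}")

In this stylized book, securitizing the high quality loans leaves a residual portfolio whose economic capital exceeds the unchanged 8 per cent charge - precisely the inadequacy described above.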
The Basle Committee on Banking Supervision, under the leadership of William McDonough, President of the Federal Reserve Bank of New York, is moving toward active discussion of reform of the Basle Accord, and the Federal Reserve will be an important contributor to this deliberation. Without laying out the specifics of a reform effort, the direction must be to ensure that regulatory capital charges, especially those related to the banking book, better match economic risks. The Federal Reserve Board's website (federalreserve.gov) has several excellent staff papers on these issues that I call to your attention. Today the same capital charge is assessed against a loan to an AAA-rated company as against a loan to a junk-rated company. As a result, banks tend to have very similar regulatory capital ratios despite the fact that they have quite different risk profiles. Fortunately, the largest and most sophisticated banks can and do differentiate between their loans for the purpose of assigning internal economic capital as part of credit risk management. Regulators have to take advantage of the same improvements in credit risk measurement and management to achieve this improved matching of capital to underlying risks. The problem is that, at this point, there is considerable diversity in the quality of such risk measurement and management across banks and some serious limitations to the robustness of internal risk measurement models. So we need to encourage both continued advances in risk measurement and management and convergence toward best practice within the industry, at least for the largest, most complex and internationally active banks. The fact is, we may not yet be where we need to be in terms of credit risk measurement and management to implement a new capital standard that effectively resolves the problems associated with regulatory capital arbitrage. So as we move toward a reform of international capital standards that harnesses the risk measurement improvements in banking, we need parallel efforts in credit risk measurement and management within the banking industry. If we can match these two efforts, we should be able to achieve meaningful reform of our capital standards. This would allow us to eliminate or at least significantly reduce incentives for regulatory capital arbitrage, ensure that risk-based capital ratios are more meaningful, and enhance the role of capital standards as the foundation for bank regulation. III. The Importance of Supervision At its best, regulation is a rather blunt instrument and, as a result, in the US great emphasis has been placed on bank-by-bank supervision, including on-site examinations. Historically, US examiners focussed almost solely on uncovering deteriorated or troubled risk positions; in effect, they carried out a "classified assets" review. But we recognize now that good supervision is proactive. It is, of course, important to force banks to recognize problems promptly when they occur; it is even better to make sure that the probability of a problem occurring is contained to begin with. To this end US supervisors now conduct examinations of bank risk measurement and management processes, not just examinations of bank assets. This also reflects the new dynamics of bank risk-taking: the prospect of rapid change in underlying risk because of the changed nature of bank practices, changes in technology that speed changes through markets, and changes in financial instruments. 
The old approach was more like taking a snapshot of troubled assets at a point in time and returning the following year for another picture. The new approach is one that features monitoring of the systems and procedures used to measure risk by the bank on a continuous basis, with the details of the examination process tailored to the complexity of the transactions and to the areas within a banking organization where the greatest risks are concentrated. The pace at which financial innovation is occurring is leaving gaps in banking regulation, which will take time to fill. There is no alternative but for supervisors, during this interval, to be more alert and to recognize the greater role for supervisory discretion. In a recent supervisory letter, for example, the Federal Reserve encouraged its supervisors to be sure that the robustness of a bank's credit risk management was scaled appropriately to the complexity of its operations. Specifically, banks that aggressively participate in cherry picking via securitizations and credit derivatives should be expected to have in place a credible system of internal credit risk rating and should be prepared to assess the economic capital appropriate to the residual banking portfolio. Having said all this, we have been reminded over and over again that some things change, but others stay the same. It is important to recognize, as recent experience has forced us to, that new financial instruments often involve the same old types of risk. A good example is derivatives. US banks, for example, hedged positions in rubles without adequately considering the credit quality of the hedge counterparties, specifically the Russian banks that were the counterparties to foreign exchange hedges. In the case of derivative positions that banks had with hedge funds, the risk was not so much the potential for rapid change in the value of the underlying position, given that banks insist on collateral backing on contracts for which they are in the money and also generally hedge these positions. However, assessment of the risk of the counterparties remains an important task, because when a counterparty defaults, banks lose one end of their hedged positions and must quickly replenish such positions in volatile and illiquid markets. And, of course, in Asia, Russia, and elsewhere banks lent to domestic businesses with poor underlying financials, sometimes because of encouragement from the government, sometimes as a result of a system of connected lending, and often because of expectations that domestic governments would bail out the important businesses. Put differently, banks most often still lose money the old-fashioned way: through poor credit judgments and bad luck. We should not lose sight of this as we adopt new procedures and techniques. Whether for old or new techniques and problems, the culture of on-site supervision is much weaker in other countries and in some cases is almost non-existent. Indeed, in some cases, supervision consists entirely of reviewing publicly available financial statements to determine compliance with prudential regulations. This is another area where convergence is badly needed. IV. Conclusion Banks in emerging economies are increasingly becoming players in global banking. To maintain the stability of the global banking system it is important that these banks converge toward best practice standards and that the regulatory and supervisory infrastructure in these economies similarly meet global norms. 
There is an important role in this process for international standard setting, technical assistance, and monitoring compliance with these standards. At the same time, consolidation, globalization, and the increased complexity and breadth of activities are increasing the challenge of regulating and supervising banks around the world. Best practices must therefore keep pace with market developments. I believe that a recipe for dynamic best practices involves enhancing the role of market discipline, reforming international capital standards, and improving supervisory practice.
Remarks by Vice-Chair of the US Federal Reserve System, Ms. Alice M. Rivlin, at the State University of New York-Brockport and Buffalo State College in Buffalo, New York on 22/10/98.
Ms. Rivlin discusses the state of the world economy and its implications for the United States Remarks by Vice-Chair of the US Federal Reserve System, Ms. Alice M. Rivlin, at the State University of New York-Brockport and Buffalo State College in Buffalo, New York on 22/10/98. I am delighted to be here today and to have this opportunity to talk with you. I am particularly pleased to be here with my friend Congressman John LaFalce, who brings such energy and good sense to the Congress, especially to the hard work of the Banking Committee, which is where I see him most frequently. I want to talk with you this morning about the turmoil in the world economy and what it might mean for us here in the United States. Why should people in upstate New York care about the economy of Thailand or Russia or Brazil? What can we do to keep our own economy strong? We are all used to the cliche that the world is getting smaller, that the electronic age has brought us closer and put us in instant touch with remote parts of the globe. We know that cross-border trade and investment have increased enormously in recent years, bringing both greater prosperity and greater interdependence. Still, when bad news from Asian economies began flashing across our screens in the summer of 1997, it didn't seem to have much to do with us. Currencies with odd names, like the baht and the ringgit, were said to be plunging, stock prices on the Hang Seng or the Nikkei were falling, and by the end of the year Korean banks were negotiating with their creditors. But here at home the economy was doing better than it had in a generation. The news from Asia seemed like a small dark cloud appearing on the horizon in the middle of a sunny day at an outdoor festivity. It might mean rain, but with any luck it would go away. We have been enjoying a truly extraordinary performance of the US economy. Since the beginning of the 1990s, we have had a long period of growth that didn't seem to be sputtering out. In fact, growth accelerated in 1997 and early 1998, which is not the usual pattern for a "mature" business cycle. We have seen few signs of strain or imbalance. Although some areas, including Upstate New York, have lagged, we have experienced mostly good growth across the country and across sectors of the economy. Unemployment has been very low - lower than most economists thought it could remain for long without igniting inflation. Wages have been increasing, but not so fast that the rise caused rapid escalation of costs. The slowdown in the rise of health benefit costs has held total compensation increases to moderate levels. Productivity increases have been offsetting compensation growth, keeping profits growing, at least until quite recently. Most remarkable of all, inflation has been falling and not showing any clear evidence of accelerating again. Even the Asian crisis at first seemed an ill wind that, given its timing, was actually blowing us some good. Some Americans felt the negative impact early, especially manufacturers whose exports to Asia declined and farmers who saw their prices drop on world markets as Asian demand fell off. At the national level, however, our economy was clearly growing at an unsustainable rate, and some slowdown was actually welcome. We were in danger of running out of workers. Sooner or later cost pressures would lead to inflation. The primary economic risk seemed to be overheating, not slow growth. 
Indeed, the Federal Reserve was concerned enough to tap the brakes by raising the short term interest rate in March 1997, before the Asian crisis hit that summer. We would probably have felt called on to raise rates again later in 1997, but by then it seemed likely that slower growth in Asia would cut US net exports and restrain our economy enough to avoid inflationary pressures. Good productivity performance and falling commodity prices, combined with low inflation expectations and the high value of the dollar, all made it less risky to wait and reduced the necessity of the Federal Reserve taking preemptive action against inflation by raising rates again. The stock market's remarkable rise both reflected and perpetuated the general good feeling about the US economy. Even the Asian crisis raised the price of US stocks, as investors took their money out of Asian markets and bought US securities instead. To be sure, US equity values were outrunning reasonable expectations of profits, especially if, as many forecasters expected, the economy slowed down. But increased wealth of consumers stimulated consumption and the low cost of equity stimulated investments, so chances of future profits were enhanced. Some worried that high equity values were a bubble that would inevitably burst, but this concern was mitigated by: (1) lack of strong evidence that stock prices were part of a wider speculative phenomenon that included real estate and other assets, as in some previous boom periods; (2) the apparent new maturity of small investors, who talked and acted as though they were in the equity market for the long haul. All in all, it was a sunny day at the fair. All of this good news generated an occasionally silly debate about whether the US was in a "new era" in which all the old rules had been suddenly repealed, or whether there was a "new paradigm," which usually seemed to mean "no paradigm" or "the sky's the limit". The more sensible version of the discussion went something like this. There are lots of reasons why the economy might be both more productive and less inflation prone than it used to be:
• more global competition
• deregulation of airlines, trucking, telecommunications; decline of barriers among financial services
• the computer and telecommunication revolution (finally) making cost savings possible, not just in manufacturing, but in services
• more flexibility and incentives built into wages, less unionization, out-sourcing of specialized services, use of temporary and part-time employees
• high levels of investment, including human investment - on-the-job and other skill training.
In economists' terms, there was hope that productivity might have moved onto a higher growth trend. Faster productivity growth would mean a higher standard of living for Americans in the future. It would mean that growth could be faster without setting off inflation (perhaps closer to 3 percent a year than 2 percent) and that unemployment could be lower without inflationary consequences. That would be very good news. It was too soon to be sure that the apparent productivity trend increase would hold up or that less inflation was becoming the norm. It was certainly true that some elements of the unusual combination of favorable inflation factors were likely to reverse. Health cost increases might accelerate again; the dollar might weaken and cause import prices to rise; oil and commodity prices might go up again. Still, the US economy seemed to be performing remarkably well. 
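The "2 percent versus 3 percent" arithmetic above rests on simple growth accounting: sustainable non-inflationary growth is roughly labor force growth plus labor productivity growth. A minimal sketch, assuming an illustrative 1 percent trend for labor force growth (the productivity trends match the figures implied in the text):

# Back-of-the-envelope growth accounting: potential GDP growth is
# approximately labor force growth plus productivity growth.
# The 1 percent labor-force trend is an illustrative assumption.

LABOR_FORCE_GROWTH = 0.01  # assumed trend growth of the labor force

for label, productivity_trend in [("old productivity trend", 0.01),
                                  ("possible new trend", 0.02)]:
    potential = LABOR_FORCE_GROWTH + productivity_trend
    print(f"{label}: potential GDP growth ~ {potential:.0%}")
# -> roughly 2% under the old trend, closer to 3% under the new one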
Then the ill wind from Asia turned into a gale, sweeping through currency and asset markets all around the world and affecting our own financial markets as well. The stock market dropped 5 percent in one day in August, then bounced wildly around. A general "flight to quality" drove long-term interest rates down on Treasuries, but rates on riskier financial instruments went up. The market mood changed to pessimism and risk aversion. Meanwhile, across the Atlantic, the initial storm warnings from Asia were also first dismissed by many Europeans as far away and not their problem - distant thunder on another shore. European economies are less heavily involved with Asia than we are and, besides, the ill wind was blowing them some good in the form of lower oil and other commodity prices. The major countries on the continent were working hard on the details of monetary union and organizing a multi-national central bank; they were struggling earnestly to get deficits and inflation down. The basically simple idea of having a common currency and a common central bank proved very difficult to achieve technically and politically, but they had done it. Growth rates were finally picking up and even high French and German unemployment rates were beginning to recede. Europeans were optimistic that the common currency, and the increased competition and deepening capital markets it was likely to engender, would energize their economies even more and lead to higher growth. This unpleasant weather on the horizon seemed at first a minor distraction. But the lightning bolt in Russia was more difficult to ignore; it was right there on their borders. In Japan, the Asian crisis had a much greater impact and came at a much worse time. Japan is closely involved with other Asian countries and suffered both financial reversals and loss of markets for Japanese goods. They were already struggling with slow growth, a banking system in need of overhaul and a political party structure in need of rejuvenation. The Asian crisis precipitated deep recession in Japan. The world economic turmoil could not have come at a moment when the Japanese were less able to deal with it. In the last few months, we have learned so much about the downside of an increasingly interdependent world that it is important to remind ourselves that the upside is enormous. Increasing volumes of trade and cross-border investment have raised the standard of living in many emerging countries at an astonishing rate. Trade and foreign investment have moved the countries of southeast Asia in less than a generation from traditional agriculture to modern industrial societies with far less poverty, much more skilled work forces and rising standards of living. At the same time, we and other industrial countries have benefited from growing markets for our exports, higher profits on our capital and lower prices for the wide variety of goods we import. When things are going well in an interlinked world, one country's prosperity reinforces another's. The rapid growth of emerging market countries, especially in Asia, in the last couple of decades, led to greatly increased trade among them as well as with industrial countries. The self-reinforcing growth seemed so positive that investors began to view investing in emerging markets as not much more risky than investing in their own more familiar industrial country markets. 
Capital flowed freely to finance projects in emerging market countries with lenders demanding only slightly higher interest rates than they would have required to lend at home. But the downside of interdependence is that when something goes wrong, the impact also spreads quickly from one country to another, through trade channels, prices and asset markets. Asian countries in trouble quickly cut their imports from each other and from the industrial world. The reduced Asian demand for food, oil and raw materials was felt by farmers in the US and Canada, Australia and New Zealand; by producers of oil in Mexico and Venezuela, Nigeria and Russia and the Middle East; by copper producers in Chile and exporters of industrial raw materials in Africa and elsewhere. Companies in Rochester and Buffalo found their sales falling. As the contagion spread, investments in many emerging market countries began to look less promising. Investors became generally more cautious and began looking for safe places to put their money - like buying US government bonds - even at low rates of return. As they saw US companies affected, they became less eager to finance them as well. We've learned a lot in the last few months about the power of contagious loss of confidence in an interlinked world. The meltdown in Russia was the clearest case. Russia is not an important economic and trading power in itself. Losses to major money center banks, here and in Europe, though magnified by unexpected interrelations with hedge funds and other operations, were not in themselves destabilizing. But what apparently mattered was the surprise, both of the suddenness and unpleasantness of the Russian default and of the fact that the Western "powers" were so powerless. No one thought the US and its allies would take the political risk of letting Russia fall apart. In the end, there was nothing they could do to stop it. The Russian collapse led investors to pull back just about everywhere. Some who had borrowed to invest in Russia had to sell assets in other markets to cover losses. Others simply said to themselves, "If it can happen in Russia, it can happen anywhere; I had better get my money to a safe place." The result was plummeting markets in Latin America, South Africa, Asia and elsewhere, and more difficulties for the countries struggling to recover. It is now clear that the Asian crisis that started in Thailand in the summer of 1997 - and seemed like a remote event of no great significance to most Americans - has set off a chain reaction reaching around the world. Much of Asia is in deep recession, the Russian economy has collapsed with negative repercussions for many of its neighbors, and growth is slowing in Latin America. Risk aversion of investors has made it difficult for any country, company or venture perceived as at all risky to get capital or to borrow, even at high rates. Countries like Brazil, Mexico and Argentina, considered quite strong a few months ago, are under mounting pressure. Brazil has had to raise interest rates and use much of its foreign exchange reserves to keep its currency from depreciating; the high interest rates, in turn, have greatly exacerbated Brazil's budget deficit, and the deteriorating outlook has led to fears that investors will pull their capital out and bankers will refuse to roll over debts. 
Economic collapse in Brazil, which is Latin America's largest economy, could precipitate serious trouble in other Latin American countries and then reverberate back around the globe to deepen the trouble of Asia, including Hong Kong, China and Japan. Another round of deepening deterioration would surely put intense downward pressure on the US economy by cutting our exports, depressing equity markets and weakening investor and consumer confidence. As Chairman Alan Greenspan has said, we cannot expect to be an "oasis of prosperity" in a deteriorating world economy. Even if we could stay prosperous alone, we should not want to take the risk that deteriorating economies in the rest of the world would augment unrest, terrorism and international conflict that could engulf the United States. What's to be done? In the near term there are clearly two priorities:
⇒ Stop the contagion so that emerging market countries can begin to get back on their feet.
⇒ Keep the world's largest economy from slowing down too much - for our own sake and the world's.
There is no easy, simple answer to stopping contagion, if indeed there is any answer. What would be the elements of a plan? First, countries most in danger of being the next to fall have to be willing to face up to their vulnerability quickly. (There is no point in moaning, "It's not our fault, we didn't do anything wrong; we just got caught in the world backwash." That's partly true, but not helpful.) They have to be willing to call in their creditors to negotiate rollovers, ask the IMF for conditional loans, and offer significant and credible internal reforms in return. Nowhere is this easy. Second, private creditors have to see their long-run interest in hanging in, even on concessional terms, rather than joining a stampede for the exits. Third, the international financial institutions and the industrial countries have to find more funding to tide these countries over the crises if they give evidence of willingness to accept strict conditions. Congress' passage of the IMF quota increase and the New Arrangements to Borrow removes a major uncertainty about the effectiveness of collective action and should do a lot to restore confidence that the chain reaction of economic deterioration can be stopped. Keeping our own economy growing would be important even if the rest of the world were not depending on us so much. Slower growth and rising US unemployment would mean fewer opportunities for Americans to earn income and move to higher skilled and better paid jobs, and less chance for families to move out of poverty and from welfare to work. Slow growth or recession in the US would also have serious consequences for the rest of the world, reducing our ability to buy and invest abroad and exacerbating economic downturn everywhere. At present, the chance of recession in the US does not seem high. Recent growth has been strong and balanced, with few indications of the strains and speculative excesses that typically go with boom/bust cycles. Most forecasters expect the US economy to slow; the hope is that it will not slow too much. The Federal Reserve, recognizing that the balance of risks has shifted from overheating to cooling off, has cut short term interest rates twice. Other industrial countries may follow our lead (Canada already has), although the major continental European countries feel easing is less appropriate for them as they consummate their monetary union. 
It is clearly important for the Japanese to get their economy moving again - to stop being part of the problem and be part of the solution - as Americans and Europeans have been pointing out, none too politely, for months. The Japanese were not shy, when they were riding high in the '70s and '80s, about offering advice to the Western industrialized countries on modernizing our manufacturing and management practices, working harder, and getting budget deficits down. So there may be a payback element to the US and European shrillness about how important it is for the Japanese to use aggressive fiscal stimulus to get their economy going again and to restore confidence by drastic and immediate restructuring of their banking system. Neither is going well. A series of stimulus announcements, mostly focussed on increasing government spending for public works, has not yet succeeded in stemming the economic slide. A next step might be a significant tax cut, but with Japanese consumers and investors both feeling anxious about the future, there is no guarantee that even a large tax cut would result in much near term spending. Japanese banking system restructuring is going slowly, although it does seem to be moving toward resolution. We are fine ones to talk, since we let our much smaller savings and loan crisis fester for a long time before facing up to it. With hindsight, the Japanese could have restructured their banks much more easily three or four years ago. The worst time to restructure failing banks is when macroeconomic conditions are reducing the value of their assets even further. Drastic action now may make the recession worse in the short run, but it may also be the only hope for restoring confidence and financial health in the future. Let's suppose this crisis turns around: the contagion is stopped, Asian and other emerging market countries begin to grow again; Japan pulls out of recession and begins to contribute with its recovery to the recovery of its neighbors; North America and Europe sustain healthy economies. Then the big question is: how to avoid such a costly episode happening again. The real costs are not monetary; they are disrupted human lives and hopes. Millions of people in emerging market countries are suddenly without jobs, food and basic necessities; children are not going to school; young people are not learning skills and are facing uncertain futures. We won't be able to avoid crises altogether. But in the new era of huge capital flows and instant communication, the amplitude, severity and cross-border reach of crises are magnified. Can we at least make crises less frequent and mitigate the damage? What we should not try to do is retreat to isolation and self-sufficient economies that don't trade or invest in each other. The simple reason is: we'd all be a lot poorer. That solution is like saying: We don't like power failures, so let's all stop using electricity or make sure that everyone has a generator in the backyard, so we don't have to depend on the power company. What we can strive for is more cautious, better informed investing and lending, better early warning systems and ways of heading off deepening trouble once it begins. Better information would help:
⇒ All countries giving complete information on their foreign exchange reserves in as close to real time as possible.
⇒ Better information on budgets and government finances, e.g., what real deficits are.
⇒ Companies and banks revealing more clearly what their assets, liabilities and profits are. 
More effective, better supervised financial systems are essential:
⇒ Prudential regulation of banks.
⇒ More carefully supervised securities markets.
⇒ Accounting standards, enforced.
One idea is that a country's financial systems could be graded by its peers in other countries, somewhat in the manner that universities or hospitals are accredited in the United States, with the results published. The concern should not be just with borrowing countries and their institutions, as though making emerging market countries' financial structures more like the structures in industrial countries would solve all the problems. For every borrower there is a lender. Industrial countries, creditor countries, have to be more sure their banks are managing their risks appropriately, and are prepared to deal with situations when a lot of things go wrong at once. Finally, we must strengthen international financial institutions so they are better able to assess the health of member countries, give timely warnings of trouble ahead, and provide short term assistance to countries that need to make repairs. All of this will take sustained effort to learn from the past and build a stronger international financial community for the future. The process ought to create some interesting, rewarding jobs and challenges for those with an interest in how the economic world works and how to make it better.
Remarks by Mr. Roger W. Ferguson, Jr., a member of the Board of Governors of the US Federal Reserve System, at the College of Management of the University of Massachusetts in Boston, Massachusetts on 27/10/98.
Mr. Ferguson speaks on the future of the financial services sector Remarks by Mr. Roger W. Ferguson, Jr., a member of the Board of Governors of the US Federal Reserve System, at the College of Management of the University of Massachusetts in Boston, Massachusetts on 27/10/98. Some Observations on the Future of the Financial Services Sector and Related Public Policy Issues Let me first thank you for your invitation to speak today at this very topical conference. The future of the financial services sector holds great interest for industry participants as well as bank regulators, given the remarkable changes affecting the industry. Developments in this sector will affect all of us. We are all acutely aware of the trend toward the blurring of lines that have separated the various parts of the financial sector, particularly commercial banking, investment banking, and insurance. Most recently, we are faced with the prospect of the formation of huge financial conglomerates, composed of amalgams of firms that have previously operated within the roughly defined boundaries of their respective industries. In addition, we have witnessed a substantial degree of consolidation through mergers in the commercial banking and other industries. I will begin my remarks today by discussing historical experience with conglomeration in the US, and the underlying conditions necessary for conglomeration to be a successful business strategy. I will then offer some thoughts regarding what the financial services sector may look like in the future. Finally, I will address some of the issues that the Federal Reserve as a supervisor and regulator will need to consider. There are several conclusions that I would like to highlight. The first is that the movement toward financial conglomerates may prove to be transitory. Second, basic financial and risk management skills will likely remain the most important determinant of a company's viability and success. Third, even in a world of financial conglomerates there will be room for smaller, and more nimble, participants. Finally, the supervisory and regulatory structure will need to evolve to meet these new challenges, but regulatory authorities should remain vigilant in carrying out our duties, particularly in enforcing the antitrust laws. II. Will Financial Service Conglomerates be Successful? The Historical Context of Conglomeration The concept of the financial conglomerate is receiving a great deal of attention at this time because of the diminishing distinctions across financial industries. However, the success of such conglomerates is certainly not guaranteed, based on historical experience. For an illustration, one need only look back to the conglomerate merger movement in the industrial sector during the 1960s. Firms such as LTV, Gulf and Western, Textron, and Litton Industries became large conglomerates through numerous mergers, heralded with great fanfare by the business press. By the 1970s, such proclamations sounded hollow, as many conglomerate firms' equities were pummeled when their corporate structure proved unsuccessful. This period was followed by a decade of spin-offs as the conglomerates sought to streamline their operations and return to their original areas of expertise - and this reversal of strategy was greeted with considerable enthusiasm by the business press. Much closer to home, we should also recall that the retail financial sector went through a phase of conglomeration about 15 years ago, in the movement toward the "financial supermarket". 
The acquisitions of Coldwell Banker and Dean Witter by Sears, Shearson by American Express, Schwab by Bank of America, and Bache by Prudential come to mind. Despite the enthusiasm for the financial supermarket in the early 1980s, by the end of the decade the concept had stagnated, and the press instead reported the benefits of specialization and the provision of niche services. I believe this past experience with business conglomerates in the US should temper our views of the inevitability of today’s financial conglomerates. There are clearly challenges to managing much larger, more diversified firms, as well as benefits. This business strategy is surely not for all financial services firms. Business Conditions Necessary for Financial Conglomerates to Succeed The difficulties associated with operating a financial conglomerate are suggested by the business conditions that are necessary for such a conglomerate to succeed. We have heard, of course, of the potential economies of scale and scope in managing information that may allow large firms to reduce the unit costs of doing business. We are also aware of the potential for cross marketing, where economies of scope on the supply side allow firms to sell multiple products to their customers, and on the demand side allow consumers to conveniently purchase bundles of services. Since a number of conglomerates of the past have been unable to improve upon separate delivery of services - including financial services - it is unclear whether technological or other changes have altered the nature of financial products or managerial capabilities sufficiently to result in significantly improved exploitation of scope economies. The potential benefits of reducing risk through sectoral and geographic diversification are another motivation for conglomeration. I have little doubt that in many cases the potential for diversification gains exists. However, we have heard very little about the expertise needed to merge the risk management practices of previously separate financial firms. While merging risk management practices may be straightforward for firms consolidating within an industry, such a readjustment is likely far more complex for firms originating in different industries. Furthermore, the challenges of managing a merger of diverse firms and the operation of the resulting conglomerate are substantial. The operation of a financial conglomerate requires the ability to integrate the financial activities, research and development investments, pricing decisions, compensation practices, cost containment activities, and cultures of an extremely large and diverse organization. Before any steps toward conglomeration are taken, each potential participant in such a merger must judge whether this extraordinary set of skills is to be found in any managerial team. However, having raised the cautionary flags about the prospects for financial conglomerates, I should add that various considerations suggest that the financial conglomerate format in the future may be successful for at least some firms. From the management perspective, technological developments should help provide more detailed information for monitoring the various activities of the firm and allow the application of more sophisticated risk management models. Past experience with conglomeration may also provide new insights for managing product integration more effectively. 
On the demand side, retail customers may become more receptive to one-stop shopping as the range of services increases and as new opportunities for accessing these services become available. III. How Might the Financial Services Sector Evolve? The extraordinary changes occurring in the financial sector make projections about how the sector will evolve quite speculative. Nevertheless, it is useful to consider how the future might look as both industry participants and regulators plan to meet that future. I would like to offer some prognostications based both on systematic projections using historical experience and on analogies and informal observations of current trends. Systematic Projections on Consolidation The commercial banking industry, which is the largest single component of the financial sector, has experienced massive consolidation since 1980. For example, the number of banking organizations in the US fell from around 12,300 in 1980 to just under 7,200 by the end of 1997. Not surprisingly, the percentage of banking assets held by the top 10 banking organizations rose from about 20 percent in 1980 to nearly 35 percent in 1997, while the share held by the top 25 increased from about 33 to 53 percent. Despite the large amount of consolidation and the increase in banking concentration at the national level, banking concentration within local market areas (MSAs and non-MSA counties) has, on average, hardly changed over the past decade. For example, the average Herfindahl-Hirschman Index (HHI) in MSAs decreased from 1,990 to 1,949 between 1985 and 1997, while in non-MSA counties it decreased from 4,357 to 4,114 (thrifts excluded); the HHI computation is sketched below. The stability of average local market concentration is noteworthy because research suggests that competition for retail customers takes place substantially at the local market level, and concentration is a determinant of competition. Based on these trends and the fact that tremendous new opportunities for bank mergers have been created by the removal of restrictions on interstate banking under the Riegle-Neal Interstate Banking and Branching Efficiency Act of 1994, I would expect more consolidation to occur in banking. Based on historical experience, some Fed economists have projected that in a decade or so about 3,000 to 4,000 banking organizations will remain. Importantly, most of these firms would be smaller organizations. In addition to commercial banking, other financial service sectors have also experienced movements toward consolidation. Insurance companies and securities firms have both experienced merger waves in recent years, though not to the extent of commercial banking. This more moderate pace of consolidation in nonbank financial industries is likely due to the absence of preexisting restrictions on geographic expansion. Like banking, insurance underwriting is characterized by a broad size distribution of firms, from several large national insurers to much smaller local or technically specialized firms. Investment banking is more heavily concentrated, due to the greater geographic expanse of the markets for their services and to the considerable benefits of portfolio diversification. Extending current patterns into the future, we can only expect a continuation of the trend towards consolidation. Informal Projections on Conglomeration As I noted earlier, in addition to a continuing consolidation in banking and elsewhere in the financial sector, there is also a tendency toward increased conglomeration. 
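For readers unfamiliar with the measure, the HHI cited above is the sum of squared market shares (in percent), so it runs from near 0 in an atomistic market to 10,000 under monopoly. A minimal sketch, using a hypothetical five-bank local market; the closing note on screening thresholds reflects commonly cited bank merger review conventions rather than anything stated in the speech:

# How the Herfindahl-Hirschman Index (HHI) quoted above is computed:
# the sum of squared market shares, with shares expressed in percent.
# The deposit shares below are hypothetical.

def hhi(shares_percent):
    """HHI in points for a list of market shares in percent."""
    return sum(s ** 2 for s in shares_percent)

# A hypothetical local banking market with five competitors.
market = [40, 25, 15, 12, 8]
print(hhi(market))  # 1600 + 625 + 225 + 144 + 64 = 2658 points

# Commonly cited screening conventions treat markets above roughly
# 1,800 points, where a merger raises the HHI by more than about
# 200 points, as warranting closer antitrust scrutiny.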
Given the prospects for legislation expanding the product lines available to banks, I would not be surprised at a wave of cross-industry mergers within the financial sector, particularly among the largest firms, that results in the formation of financial conglomerates. In order to consider the form of the financial sector in the future, I believe it is very important to distinguish between retail financial services (i.e., those for households and small businesses) and wholesale or large-scale corporate financial services. The customer bases of these two sectors differ dramatically, as do many of the products used. Despite their differences, both the retail and wholesale sectors have moved to expand their product offerings. In retail financial services, the desire by banks to expand their product offerings is both a response to new products developed by other industries (for example, the emergence of annuities in the insurance industry), as well as a response to customers’ substitution of other products (such as mutual funds) for traditional bank products (such as traditional savings accounts). At wholesale firms, we have witnessed the push into securities underwriting and the establishment of offices throughout the world in an effort to respond to the financial needs of large corporate customers. Because of the very large scale required to serve a wholesale market consisting of a relatively small number of sizable customers, relatively few very large firms will likely remain active in the wholesale portion of the financial sector. Of course, the large wholesale institutions may operate in the retail portion as well, although the latter activity does not necessitate the extraordinary size of these firms. In addition, there are clearly boutique firms that achieve enviable results by serving just one need, say M&A advice, for large businesses. The existence of both retail and wholesale institutions will have a direct bearing on the future of financial services. In contrast to wholesale institutions, retail financial institutions are able to operate profitably on a small scale. Given the continuing strong preference of many retail customers to deal with local offices, especially for many services of depository institutions, small retail financial institutions will likely be able to serve the needs of individuals and small businesses in local markets. Such niche players would play an important role in a retail financial sector composed of perhaps several thousand relatively small retail financial institutions, and a much smaller number of large financial firms that offer wholesale products, or both wholesale and retail products. The large firms that offer retail services would likely operate offices throughout the country. This scenario would not be unlike that which we observe quite frequently in the nonfinancial retail sector. Restaurants, department stores, convenience stores and supermarkets are examples of locally based markets with a strong presence of large national or regional firms operating alongside a large number of small local sellers. IV. What Do Conglomeration and Consolidation Mean for the Federal Reserve? Now I would like to consider some of the issues that are raised by the evolution of the financial services sector for the Federal Reserve. Clearly the Federal Reserve must adapt along with the industry it supervises and regulates. 
Reducing Moral Hazard through Increased Market Discipline

A key concern, which is a challenge internationally as well as in domestic supervision, is the well-known "moral hazard" problem. Moral hazard occurs when a firm takes on greater risk than is prudent because it does not bear the full cost of adopting that risk position. Instead, the cost of this risk is borne by the financial safety net, which in the US banking system consists of access to federal deposit insurance, the Fed's discount window, and Federal Reserve payments system guarantees. While this safety net in my judgment will surely remain firmly in place, incentive-compatible policies should be developed and applied that shift more of the cost of risk-bearing onto the firms themselves. Such policies would seek to increase and improve the role of "market discipline" in constraining bank risk taking.

One widely discussed idea for increasing market discipline is to require a bank to issue a minimum amount of subordinated debt to non-related parties. The basic idea is to create a set of uninsured bank liability holders who will have risk preferences similar to those of bank supervisors, and who would thus discipline bank risk taking to a similar degree. In addition, changes in the price of a bank's subordinated debt might prove to be a useful signal of the market's evaluation of a bank's financial condition. Such signals could perhaps be used by supervisors as an aid in identifying problem banks. While a policy of "mandatory subordinated debt" is not without its problems and implementation difficulties, it is certainly worthy of serious discussion.

A second way that market discipline could be encouraged is through financial reporting and transparency guidelines, which provide the market with the information necessary to make informed transactions. This improved information is essential for market discipline to function. Greater transparency by itself will not alleviate the moral hazard problem inherent in insured deposits, since fully informed investors may be willing to make deposits in a bank with an overly risky portfolio as long as their deposits are insured. However, transparency can reduce the need for some types of direct regulation in uninsured financial activities. If the holdings of financial institutions are apparent to the market, the institutions will adjust their risk positions optimally to maintain the ability to attract capital. Improved disclosure by domestic as well as international institutions would increase the information that allows investors to assess the risk they bear.

Communicating with Other Regulators

If the financial services sector moves further in the direction of conglomeration, the Federal Reserve will experience an ever greater need for communication with regulators of different financial industries. Regulators of the banking industry will need to coordinate with agencies governing the securities and insurance industries. Communication with insurance regulators will present a particular challenge, since in the US insurance is regulated exclusively at the state level. Furthermore, international communication among central banks and other financial regulators is already essential. This need can only become greater as markets for many financial services continue to extend across national borders. Indeed, the need for crisis intervention could be reduced through preventive measures that include ongoing communication and coordinated improvements in supervisory methods.
Ensuring Competition

Also important is the Federal Reserve's role in encouraging competition through implementation of its bank merger policy. This role is particularly important at a time when banking and the financial sector in general are experiencing an unprecedented merger movement that is radically restructuring these industries. Before discussing newer analyses that we might be called upon to perform, let me emphasize that I think we must continue to be extremely vigilant in protecting local market competition. Applicants should remember that we measure impacts on local market structure using both the HHI and concentration ratios, and a transaction that passes one of those screens may not pass the other. In particular, a small increase in concentration as measured by the HHI may mask a high market share held by the dominant competitor. In those cases, I would expect that mitigating factors, even for small increases in market share, must be extremely strong to gain approval.

The task of enforcing bank merger policy will become potentially more challenging if financial conglomeration occurs, since the analysis of competition relies on the correct definition of the relevant geographic and product markets. Thus, increasing numbers of financial conglomerates that sell traditional retail banking services along with insurance and securities will make the analysis of the competitive effects of mergers more difficult. In addition, it will remain important to distinguish between markets for large-scale wholesale products and markets for household- and small-business-oriented retail banking products. We will also need to consider the effects that networks, such as ATM and point-of-sale networks, and other forms of electronic banking have on competition for financial services. Finally, as financial relationships become increasingly complex, we must work to reduce market inertia that could lessen competition by restricting customers' ability to choose freely among providers of financial services.

Domestic competition is particularly important not only because it increases the welfare of our fellow citizens, but also because other reforms, such as those based on market discipline, probably work best in a competitive market. Also, domestic competition is one of the most powerful forces giving rise to international competitiveness.

Reducing Systemic Risk through Improved Supervision

Finally, the Federal Reserve must continually improve the bank supervisory process in order to help manage risk that could be introduced into the financial system by the changes occurring in the financial sector. The effects of consolidation and conglomeration on systemic risk cannot be ignored. If risk is not managed properly within the consolidated firm, the repercussions could extend beyond a single institution and well into the financial sector. Even if the moral hazard problem could be largely solved, so that financial firms bear the individual costs of the risks they adopt, these firms may fail to internalize the costs borne by the entire financial system when their portfolio decisions result in bad financial outcomes. The risk of such damage can be held in check only by sound supervisory practices. I do not have time to do more than quickly mention some of the important efforts that the Federal Reserve is pursuing in order to improve bank supervision.
Much of our effort, along with that of other bank supervisors, is focused on improving our ability to monitor and supervise the risk management practices of banks and banking organizations. Closely related work is aimed at devising capital standards that more accurately reflect the realities of today's banking world. Through it all, I would emphasize that old-fashioned supervision, the core of which is the on-site examination, remains central to our supervisory strategy. Improved market discipline, increased communication with other supervisors, and enhanced stability through greater diversification are necessary. But improved supervision is also critical, and sometimes there is simply no substitute for a well-trained and highly motivated examiner "on the ground."

V. Conclusion

In closing, I would like to reemphasize the basic skills that remain centrally important in the financial services sector. Regardless of the extent and the nature of conglomeration, technological development, and new financial instruments, good fundamental business practices will remain essential: credit underwriting and risk management skills, cost control, and appropriate pricing practices. While technological improvements are an aid in nearly all these areas, only sound management practices will allow industry participants to take advantage of any advances in technology. At the same time, fundamental principles of supervision and regulation must be applied, including sound credit standards and on-site examinations, to ensure the safety and soundness of the financial sector. Finally, regulators should implement antitrust policy in a manner that will ensure competitive prices and services as the financial sector evolves and consolidates.

Despite the importance of vigorous supervision and regulation, my colleagues on the Board and I are quite sensitive to the fact that it is not our role to judge the wisdom of management decisions and business strategies beyond ensuring that public policy standards regarding supervision and regulation are met. I look forward to observing and participating in this sector's evolution into the next century. I thank you once again for the opportunity to discuss these issues with you today.
Mr. Kelley gives an economic assessment of the countdown to Y2K

Remarks by Mr. Edward W. Kelley, Jr., a member of the Board of Governors of the US Federal Reserve System, at the Annual Economic Symposium at Houston Baptist University on 29/10/98.

I am pleased to be back with you today to participate in this annual economic symposium hosted by the Houston Baptist University and the Federal Reserve Bank of Dallas. These are challenging times for economic policymakers, and there are many issues currently facing our nation's central bank. One of them is the Year 2000 computer problem, and that is our subject for today. Much has been written on this topic, and I am sure that everyone here is familiar with the basic issue - specifically, that information generated by computers may be inaccurate, or that computers and other electronic systems may malfunction because they cannot correctly process Year 2000 dates. I will dwell no further on the nature of the problem itself, but rather attempt to focus on its likely economic impact.

The economic stakes are very large, and the spectrum of possible outcomes very broad, ranging from minimal to extremely serious. For the truth of the matter is that this episode is unique: We really have had no previous experience with a challenge of this sort to give us reliable guideposts. Although the lack of a precedent may be unnerving, that certainly does not free us from the obligation to attempt to analyze how the millennium bug is affecting and will affect the economy. This economic puzzle has many complex pieces - some of them quite inscrutable before the event - and my task this morning is to assemble for you the tidbits of information that we do have into a coherent picture of where the Y2K problem appears likely to take us. Let me turn first to the reasons why the Year 2000 problem is so difficult to manage. Then I shall discuss the actions that are being taken by government and businesses to deal with the millennium bug and the effects these measures are having on economic activity at the present time. Finally, I shall turn - not without some trepidation, I might add - to the spectrum of plausible outcomes for the Year 2000, as I see it. As you shall shortly hear, I believe the Y2K pessimists have not fully recognized the attention that is being given to this problem and the significant progress that has already been made. Given what we know today, I am cautiously optimistic that the millennium bug will not cause major economic disruptions when it bites.

The Difficulty of Managing the Y2K Problem

Why has it been so difficult for people to come to terms with the Year 2000 problem? At the most basic level of any organization - be it public or private, large or small - the Y2K problem was all too easy to ignore. It is a hidden threat, cloaked in the arcane language of computer programs and in embedded microchips. As such, it was difficult at first for senior management in the corporate world and the public sector to recognize the serious nature of the problem. This was compounded by the fact that, for an individual operation, the costs and benefits of addressing the problem are neither easily quantified nor particularly attractive. But after an initial period of denial, the majority of US business leaders now have recognized the problem and are taking action to correct it. Nonetheless, given the pervasiveness of the problems involved, I suspect that even the most thoroughly prepared organizations are worried that something important might be missed.
Indeed, many organizations are developing elaborate workarounds and other contingency plans as insurance against Y2K disruptions. Thus, uncertainty about the ultimate effectiveness of Y2K remediation programs already is affecting corporate investment and production plans and obviously will be with us until at least January 2000.

The second feature of the millennium bug that makes it so difficult to analyze is the interrelated character of many computer systems. An individual company may be satisfied that it has done all it can to fix its own systems, but it still may feel very vulnerable to the actions taken by its suppliers and customers. In an environment where just-in-time inventory systems and electronic data interchange (EDI) have linked economic activities very closely together, one firm's failure has the potential to ripple through significant segments of the chain of production, services, and distribution. Thus, greater coordination of Y2K remediation activities would have benefits for everyone. But it is clearly impossible to coordinate the Y2K activities of millions of individual establishments. To help fill this void, numerous organizations have emerged as clearinghouses of information, and I have high hopes for efforts such as "National Y2K Action Week", which was just sponsored by the Small Business Administration (for the President's Council on the Year 2000), together with a number of private groups. Other institutions are functioning as vehicles for system testing. Witness this past summer's inception of Federal Reserve payment system testing with its depository institution customers, and similar work by the Securities Industry Association. Bank supervisors, including the Federal Reserve, are holding their banks accountable for the effectiveness of their Y2K efforts, and I can assure you that there will be consequences for banks that do not fulfill their obligations. But many organizations are on their own to test their critical systems with their key suppliers and customers.

Because this situation is both complex and fragmented, it is a very difficult task to quantify the aggregate costs of Y2K remediation. Similarly, and perhaps more importantly, we also have no national scorecard on how effective our economy is being in its remediation efforts. Under these circumstances, it is not hard to understand why the millennium bug is viewed as such an unpredictable phenomenon, and why it has attracted so much gloom and doom commentary.

What is Being Done?

So, what is being done? The short answer is that a lot is being done. Let me review with you our understanding of the status of efforts by the private sector, government entities, and the world community to deal with the Year 2000 problem. As far as the private sector is concerned, efforts to deal with the millennium bug have been intensifying. In my testimony before Congress in April, I suggested that the private sector might spend approximately $50 billion over the next two years to tackle the Y2K problem. This figure was based on a reading of the available corporate filings with the SEC as well as our own guesstimates for firms that either are private, did not discuss Y2K expenditures in their 10-K reports, or stated that their Y2K programs were not having a "material" influence on their bottom line. We have updated our research using the latest quarterly reports, and we were pleased to find a larger percentage of firms discussing the costs of their Year 2000 strategies, and I am confident these funds are being used effectively.
While our estimate of the "$50 billion bug" still seems reasonable, I fully expect to see this figure move upward over time. I also perceive that the tools available to companies to address Y2K issues have increased substantially. Over the past six months, most major computer hardware and software companies have released documentation of the Y2K readiness of their products on their web sites. Similarly, most of the major computer publications now have elaborate "how to" guides on their web pages that will aid consumers and small businesses in their efforts to make their systems Year 2000 compliant. Commercial software producers also have been busy, and new software products are becoming available to aid programmers in repairing code. I hope and believe people are availing themselves of these new resources. To paraphrase an old adage, "we have brought the Y2K horse to water, and he appears to be drinking!"

As far as financial institutions are concerned, I am encouraged by the progress that has been made over the past year, and there is every reason to be confident that our financial system will be ready. Based on a review of all depository institutions completed by federal banking regulatory agencies, the vast majority are making satisfactory progress in their Year 2000 planning and readiness efforts. About four percent were rated "needs improvement" and fewer than one percent were rated "unsatisfactory". In these cases, the agencies have initiated intensive supervisory follow-up, including restrictions on expansionary activities by Year 2000-deficient organizations. While we can be confident institutions are addressing the problem, it is important to recognize that regulators cannot be responsible for ensuring or guaranteeing the Year 2000 readiness and viability of the banking organizations that we supervise. Rather, the boards of directors and senior management of banks and other financial institutions must be responsible for ensuring that the institutions they manage are able to provide high quality and continuous service in January 2000.

The Federal Reserve System has itself made significant progress on Y2K issues, meeting all of the goals that we have set for ourselves. In addition to completing an initial review of the Y2K readiness of all banks subject to our supervisory authority, we have renovated our own mission-critical applications, nearly completed work on all others, and are close to completion of our internal testing. As mentioned, we have opened our mission-critical systems to customers for testing with us and have progressed significantly in our contingency planning efforts. This has required much hard work on the part of many people, but it is paying off in visible ways.

As in the private sector, activity to fix computer systems maintained by the federal government is intensifying. Progress has been made in many areas, but the President's Council on the Year 2000 agrees that much work still needs to be done. Reviews of federal Y2K programs have highlighted needed areas of improvement, and the Congress has budgeted about $5½ billion for Y2K fixes. Legislation also has been enacted that would facilitate the sharing of Y2K information among businesses and clarify the legal liability associated with reporting Y2K readiness information. All of these are positive developments. With the notable exception of the recent dress rehearsal in Lubbock, I must admit that far less is known about the effectiveness of the Y2K preparations by state and local governments.
Attention often is focused on high-profile systems such as the nation's air traffic control computers or the electric power grid, but there are many smaller, yet quite critical, electronic systems maintained by states and municipalities that also are very vulnerable. For example, innumerable vitally important computer-controlled utility systems are operated by local government units. And, as any Washington commuter will tell you, one or two malfunctioning traffic signals can cause serious congestion, confusion, and delay, and the complete breakdown of traffic management systems likely would cause near total gridlock. To try to get a handle on Y2K remediation activity at the state level, we surveyed the web pages of all fifty states, the District of Columbia, and Puerto Rico. As best we can tell, the states that reported Y2K programs are budgeting $1 to $2 billion for this purpose. However, five states made no reference to Y2K preparedness, and 23 states did not cite cost estimates for their programs. Thus, based on this and other anecdotal evidence I have seen, I suspect that much work still needs to be done in this area as well.

On the international level, there is both good news and bad. The governments of various industrialized nations have stepped up their own internal Y2K awareness and remediation programs over the past six months, and international cooperation is intensifying through efforts such as the Joint Year 2000 Council, chaired by my colleague Federal Reserve Governor Roger Ferguson. Most large multinational corporations report they are well along in their preparations worldwide. However, the conversion to the euro and world financial troubles obviously are deflecting all too much attention away from Year 2000 issues, and I worry that time will simply run out for some countries - particularly in the developing world. As a result, risks exist for some level of disruption to international trade and capital flows.

How are Year 2000 Preparations Affecting the Economy?

Corporate efforts to deal with the Y2K problem are affecting economic activity in a variety of ways. On the positive side, an important element in some Y2K programs is the accelerated replacement of aging computer systems with modern, state-of-the-art hardware and software. Such capital expenditures should raise the level of productivity in those enterprises, and, in general, the need to address the Y2K problem has increased the awareness on the part of senior executives of the complexity and importance of managing corporate information technology resources. The increased replacement demand also has contributed to the spectacular recent growth in this country's computer hardware and software industries. But, ultimately, we are largely shifting the timing of these investment expenditures: Today's added growth is likely "borrowed" from spending at some time in the future. And, while accelerating some systems investments, many institutions will "freeze" their systems in the middle of 1999 - effectively forgoing the installation of major new hardware and software systems as we approach the millennium. This certainly will also influence spending on technology - shifting some of it out of 1999 and into 2000.

While a shift in the timing of investment spending may help to solve the Year 2000 problem for some firms, it certainly will not be sufficient. Most organizations do not have the option of simply scrapping their existing computer systems and replacing them with shiny new "turn-key" hardware and software.
To one degree or another, we all rely on elaborate proprietary software systems that have been developed over many years, and these programs must be debugged by skilled programmers. This obviously is a very labor-intensive and time-consuming process, and organizations have had to boost their IT staffs to carry out this work. The good news is that many got an early start on this problem and are now well along on their repair and testing programs. The bad news is that there is no corresponding increase in the firm's marketable output, and this lowers the overall level of productivity in the enterprise, boosts its costs, and reduces its profitability.

One area in which uncertainty about Y2K readiness is likely to have noticeable effects on economic activity in 1999 is the management of inventories. As the millennium approaches, I expect businesses will want to hold larger inventories of goods as insurance against Y2K-related supply disruptions. Such a shift from "just-in-time" inventory management to a "just-in-case" posture is likely to prompt an increase in orders and production during 1999. These stocks subsequently would be run off in the first half of 2000. We at the Fed, for example, will do precisely that in our management of the production of new currency.

While Year 2000 preparation efforts may give a temporary boost to economic activity in some sectors, the net effect on the aggregate economy probably is negative. Other than the obviously very valuable ability to maintain its operations across the millennium, few quantifiable benefits accrue to the firm - and overall productivity gains are reduced by the extra hours devoted to reprogramming and testing. Conservative estimates suggest that the net effect of Y2K remediation efforts might shave a tenth or two a year off the growth of our nation's overall labor productivity, and a more substantial effect is possible if some of the larger estimates of Y2K costs turn out to be accurate. The effects on real gross domestic product are likely to be somewhat smaller than this but could still total a tenth of a percentage point or so a year over the next two years.

What is the Spectrum of Plausible Outcomes for Economic Activity in 2000?

As we have discussed, a great deal of work either is planned or is under way to deal with the Year 2000 problem. But what if something slips through the cracks, and we experience the failure of "mission critical" systems? How will a computer failure in one area of the economy affect the ability of others to continue to operate smoothly? How severe might the consequences of Y2K problems emanating from abroad be? The number of possible scenarios of this type is endless, and today no one can say with confidence how severe any Y2K disruptions could be or how a failure in one sector would influence activity in others. That said, let me now turn to a discussion of the spectrum of plausible outcomes for economic activity in the Year 2000.

What will happen as the millennium rolls over? A few economists already are suggesting that Y2K-related disruptions will induce a deep recession in 2000. That probably is a stretch, but I do not think we will escape unaffected. I anticipate that there will be isolated production problems and disruptions to commerce, and perhaps some public services, that will reduce the rate of GDP growth early in 2000. Certainly a mild inventory cycle seems very likely to develop.
But, just like the shocks to our nation's physical infrastructure that occur periodically, the Y2K shock to our information and electronic control infrastructure is most likely to be short-lived and fully reversed. We have many examples of how economic activity has been affected by disruptions to the physical infrastructure of this country. Although the Y2K problem clearly is unique, some of these disruptions to our physical infrastructure may be useful in organizing our thinking about the consequences of short-lived interruptions in our information infrastructure.

In early 1996, a major winter storm paralyzed large portions of the country. Commerce ground to a halt for up to a week in some areas, but activity bounced back rapidly once the roads were cleared again. Although individual firms and households were adversely affected by these disruptions, in the aggregate the economy quickly recovered most of the output lost due to the storm. In this instance, the shock to our physical infrastructure was transitory in nature, and, critically, the recovery process was under way before any adverse "feedback" effects were produced. Last year's strike by workers at United Parcel Service is a second example. UPS is a major player in the package delivery industry in this country, and the strike disrupted the shipping patterns of many businesses. Some sales were lost, but in most instances alternative shipping services were found for high-priority packages. Some businesses were hurt by the strike, but its effect on economic activity was small in the aggregate.

Obviously, an important element in any forecast for the Year 2000 is the degree to which the failure of the computer system at one institution causes ripple effects on the systems of others. If the disruptions that occur are not isolated events, as I have assumed, but spread across key sectors of the economy by interacting with each other, then an outright decline in real GDP in the first quarter of 2000 could be a plausible outcome. The more dire of the Y2K scenarios would involve, among other things, a perpetuation and intensification of these feedback effects. In such an event, production disruptions could turn out to be a national or international phenomenon and could spread from one industry to another. Under these circumstances, the decline in economic activity would prove to be longer lasting, and a recession - typically defined as a decline in real gross domestic product in two consecutive quarters - could ensue. But let me quickly stress that I do not think this recession scenario has a very high probability. It is possible, but a lot of things have to go wrong for it to occur, and much is being done to prevent its occurrence.

What can Monetary Policy Do About the Millennium Bug?

What can monetary policy do to offset any Y2K disruptions? The truthful answer is "not much". Just as we were not able to plow the streets in 1996 or deliver packages in 1997, the central bank will be unable to reprogram the nation's computers for the year 2000. The Y2K problem is primarily an issue affecting the aggregate "supply" side of the economy, whereas the Federal Reserve's monetary policy works mainly on aggregate "demand".
We all understand how creating more money and lowering the level of short-term interest rates gives a boost to interest-sensitive sectors (such as homebuilding), but these tools are unlikely to be very effective in generating more Y2K remediation efforts or in accelerating the recovery process if someone experiences some type of Year 2000 disruption. We will, of course, be ready if people want to hold more cash on New Year's Eve 1999, and we will be prepared to lend whatever sums may be needed to financial institutions through the discount window under appropriate circumstances or to provide needed reserves to the banking system. And, if a serious Y2K disruption should have significant feedback effects on aggregate demand - as I outlined in the recession scenario - there obviously would be a role for the Federal Reserve to play in countering the downturn. But there is nothing monetary policy can do to offset the direct effects of a Y2K disruption.

Conclusion

In summary, as I stated at the outset of my remarks, I am cautiously optimistic that the United States will weather the Year 2000 storm without major disruptions to economic activity. Some of the more frightening scenarios would not be without a certain plausibility if this challenge were being ignored. But it is not being ignored - as this symposium today clearly illustrates. An enormous amount of work is being done in anticipation of the rollover of the millennium. As the world's largest economy, the United States bears the heaviest burden of preparation. But this is truly a worldwide issue, and to the extent some are not adequately prepared and experience breakdowns of unforeseeable dimension, we shall all be affected accordingly. We at the Federal Reserve intend to do our utmost, and we hope and trust others will do likewise.
Mr Ferguson reviews last year's economic performance in the United States and comments on topics related to macroeconomics and monetary policy

Remarks by Mr Roger W. Ferguson, Jr., a member of the Board of Governors of the US Federal Reserve System at the National Economic Association Forum in New York on 3/1/99.

Looking Forward, Looking Back: Thoughts on the Start of a New Year

Thank you very much for giving me the chance to join you to start the New Year. While most of our fellow citizens are relaxing with family and friends, it is nice to know that the economic braintrust of the United States is busily and happily at work here in New York. I know that means the science of economics will be off to a fast start again this year, and in my remarks today I shall raise several important issues that I think warrant greater attention by our profession. The end of one year and the start of another is a natural time to focus on accomplishments and also to create a list of things to do or understand better. In the spirit of old year endings and new year beginnings, I would like to review last year's economic performance, highlight some prospects for the upcoming year's economic performance, and, finally, raise some topics related to the underpinnings of macroeconomics and monetary policy. Of course, the views that I am about to express are my own, and do not reflect those of the FOMC or of the Board of Governors.

Last Year's Economic Performance

The last twelve months have been a most challenging time in general, and particularly from the standpoint of monetary policy. We started the first half of the year with a focus on the crosswinds of domestic strength and international weakness that seemed to be buffeting the United States economy. Domestic demand was particularly strong, led by consumption expenditures, which grew at a 6 percent annual rate in both the first and second quarters of the year. However, consumption was not the only engine driving the spectacular performance of the first half of 1998. Business fixed investment, paced by spending on producers' durable equipment, rose at more than a 22 percent annual rate in the first quarter and at nearly a 13 percent annual rate in the second quarter. Additionally, the housing sector expanded at a robust 15 percent annual rate or more in both the first and second quarters. These latter two sectors were undoubtedly supported by relatively accommodative financial conditions. During much of 1997, long-term interest rates had trended downward, and by early 1998 rates on fixed-rate mortgages were close to their lowest levels in twenty-five years. Similarly, the rise in equity prices over much of 1997 and the first half of 1998 provided ample incentive for consumption by raising the value of household assets and improving the general sense of financial well-being of our citizens, as well as lowering the cost of capital faced by businesses.

This domestically driven good news was counterbalanced by a sense of foreboding from Asia. During the first half of 1998, the net export drag, due in large part to the turmoil in Asia, subtracted about 2 percentage points from GDP growth. The trade deficit grew to well over $200 billion, again at an annual rate. For the March through July FOMC meetings last year, the Committee maintained a bias toward raising interest rates, but did not actually take action.
The sense of the FOMC was that the tightness in labor markets and the general growth of demand would be likely to create upward pressures on wage growth and eventually on the rate of inflation. However, uncertainty regarding the degree and timing of the impact from the Asian crises, and the fact that inflation was actually subdued, allowed us to adopt a "wait and see" posture.

The second half of 1998 provided a very different configuration of events that, as you know, created the need for some monetary policy action. Interestingly, the catalyst for this monetary policy decision was not directly Asia or a slowing economy from net export drag. Rather, it was the indirect impact of Asia, working through a Russian debt moratorium and ruble devaluation, which provided the impetus for an easing in U.S. short-term interest rates. By the end of August, financial markets in the United States had become quite unsettled, with an open and obvious flight to quality and liquidity creating a risk of undue credit tightening for private sector borrowers. As we said in October in an inter-meeting rate reduction that brought the decline in the federal funds rate target to 50 basis points, "Growing caution by lenders and unsettled conditions in financial markets more generally are likely to be restraining aggregate demand in the future." The flight to liquidity, which was evidenced in unusually wide spreads between U.S. government securities of similar risk characteristics, and the flight to quality, which showed through in unusually wide spreads between securities of differing risk characteristics, were the defining factors for much of August, September and October. More recently, we have seen some signs of easing of financial market stresses and some unwinding of the associated flight to quality. These moves have been encouraged, in part, by policy actions of many central banks around the world and, in part, by a stronger sense that the world's major industrial countries, the G7, are beginning to address a number of the factors that have contributed to uncertainty and volatility in financial markets around the globe.

Outlook for 1999

At present the consensus view seems to be that growth in the second half of 1998 was probably a bit slower than the rapid pace of the first half, but that real GDP growth still averaged well above 3 percent on a fourth quarter to fourth quarter basis. The consensus is that this year's real growth will abate further, perhaps to about 2 percent, which would eventually produce a "soft landing" of sustained growth near the economy's potential and low inflation. It is not hard to imagine risks on both sides of this scenario, especially with financial markets still not having fully settled down, and I can see that 1999 will require a continued high level of vigilance from policy makers.

One of the factors that I find most interesting as we start this year is the likely impact of the upcoming century date change. As you may know, I am Chairman of the Joint Year 2000 Council, which is a group of financial regulators from around the world that has spent the last nine months focusing on the Year 2000 computer problem. In an international context, the Year 2000 is likely to have differing impacts across different regions. Here in the United States, my colleague Governor Mike Kelley has stated that we are likely to see some disruptions to economic activity because of Year 2000 problems, but the effects are likely to be temporary and quickly reversed.
I believe businesses already are planning to manage their inventories with Year 2000 considerations in mind. Later this year, many firms will want to hold larger inventories of goods as insurance against Year-2000-related supply disruptions, and they are likely to run off those stocks in the first half of 2000. Households concerned about the viability of some payment mechanisms could well desire to hold more cash, and both the Federal Reserve and depository institutions are preparing for this contingency. Overseas, Europe is about to convert to the euro, and we shall see tomorrow how successful their preparations have been. It is now important for senior management of major European institutions to turn their attention to preparations for the Year 2000. In Asia too, firms and markets should focus energy on preparing for the Year 2000. The events of the last eighteen months have probably distracted their attention from this problem. However, as we get closer to the end of this year, market participants will require more information on the Year 2000 preparations of counterparties. Those that are not prepared, or do not disclose their state of preparedness, may find that credit is harder to obtain. It is in everyone's interest that we not become complacent as we face this last twelve months before the start of the new millennium.

Observations and Open Issues From Recent Experiences

Against this backdrop, I want to raise four issues important to the conduct of monetary policy for your consideration and study. The first two issues involve the supply side of the economy and grow out of the unusual conjuncture in recent years of rapid growth and high resource utilization with low, if not declining, inflation. One is the proper measurement of resource tightness. The two standard measures of resource tightness, capacity utilization in manufacturing and the rate of unemployment, have historically moved fairly closely together over the cycle. However, they have diverged in the past several years, in part as the surge in investment has helped to lower capacity utilization in the manufacturing sector while labor markets have become tighter and tighter. We need a better understanding of the causes of this divergence and, if it persists, a clearer sense of which measure of resource utilization best foreshadows the emergence of price pressures.

A second, and related, issue is whether the nature of "capacity" has changed. Many observers have argued that in the 1940s and 1950s manufacturing capacity was more normally characterized by large-scale units of fixed machinery, such as blast furnaces and assembly lines. This capacity took long lead times to manufacture, test and install, so available capacity was easier to measure and slower to change. Therefore, high levels of capacity utilization were good predictors of resource tightness that was likely to translate into pricing pressure. Now, we hear, capacity in manufacturing is more technology intensive and can be adjusted more easily to reflect supply and demand conditions. This relatively "elastic" supply of manufacturing capacity implies that capacity utilization may not become "tight" by historical standards, and capacity utilization is therefore a less certain early warning signal of potential pricing pressure. I have seen no proof of the assertion that the nature of manufacturing capacity has changed, although the experience of the last several years suggests that this might be so.
Any research that you can bring forth on this issue would be very beneficial.

A third issue on my mind is valuations in equity markets, the role they should play in policy making, and whether old relationships have changed. Many observers have asked if I think that the Federal Reserve can or should have a fixed view on the proper level of equity markets. For me the answer is that the Fed cannot target specific levels in equity markets. Equity prices are set by the give and take of supply and demand, with participants buying and selling based on their own information. Investors can and should be influenced by several factors, including expectations of corporate earnings, the attractiveness of alternative investments, both domestic and international, differing valuations of underlying assets, and differing appetites for "ownership" risk as opposed to "creditor" risk. I believe that the Federal Reserve's tools, primarily short-term interest rates, are too blunt to attempt to achieve specific levels of stock market valuations. Nor do I believe that policy makers should necessarily attempt to put their judgments of correct values above those of the market.

However, equity markets send important signals to policy makers and have spillover effects into the real economy. As you know, economists often speak of the "wealth effect," and econometric modeling indicates that consumers ultimately tend to spend about 2 to 4 percent of incremental wealth. In addition, consumer sentiment is tied to feelings of financial well-being. Through both of these channels, the so-called wealth effect and the more general impact on consumer sentiment, equity valuations can and do have an impact on consumption and on macroeconomic performance. Additionally, equity markets are an important source of investment capital, and valuations in the stock market are one determinant of the cost of capital for businesses. Therefore, equity prices have an impact on business fixed investment, a major driver of our economy. Finally, equity markets are of interest to policy makers because we have a responsibility for macro-stability. We have seen in other economies that bubbles and busts in financial markets can create unsettled conditions that affect real economic activity. Therefore, maintaining healthy market conditions is a concern of policy makers.

The questions I have with respect to equity markets go to the issues of valuation and the wealth effect. Economists propose numerous approaches to determining the "correct" level of equity prices. One such approach compares equity market valuations (namely earnings/price ratios) to the return on fixed income securities, generally the ten-year U.S. Treasury bond. But many observers have suggested that this measure of "correct" stock market valuation may no longer be accurate. Some suggest that the nature of equity markets has changed, with the introduction of new instruments that allow for better management or sharing of risks. Therefore, these observers would assert, lower risk premia over risk-free returns are appropriate, and old relationships between E/P ratios and the return on Treasury instruments no longer hold. Others would argue that in this world of service firms, high-tech companies, and knowledge-intensive industries, our accounting treatments do not accurately reflect underlying asset values, and therefore measures of "correct" stock valuations do not capture the economic reality that market participants see.
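To make the comparison concrete, the rule of thumb behind the earnings-yield approach just described can be written out. This is a hedged sketch; the price/earnings and yield figures are hypothetical, chosen for illustration rather than taken from the speech:

E/P \approx y_{10},

that is, the market is read as roughly "fairly valued" when the earnings yield E/P (the inverse of the price/earnings ratio) is close to the ten-year Treasury yield y_{10}. For example, a hypothetical market P/E of 25 implies E/P = 1/25 = 4 percent; set against a hypothetical ten-year yield of 5 percent, the rule would flag equities as expensive - unless, as the observers cited above argue, a lower equity risk premium now justifies the gap.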
These assertions need scientific investigation that perhaps someone in this room can perform. With respect to the "wealth effect," 1998 was unusual in having several months with a negative published savings rate. The existence and persistence of this phenomenon was widely reported in the financial press. Though the focus on the "negativity" of the savings rate is misplaced in light of measurement issues, what is evident is that the savings rate is low and has declined substantially. This experience calls into question whether we have accurately measured the feedback of wealth on consumption. When we look back on this period, we may determine that there are periods in which the "wealth effect" is noticeably greater than the 2 to 4 percent that appears to be the norm. Economic research also might focus some attention on whether increases in wealth have a long lag time in reaching their full impact on consumption, and on the degree to which "long-term" wealth gains are treated differently from "shorter-term" gains that might be thought to be more transitory.

Finally, I will turn to the role of international developments and exchange rates in policy. Our mandate gives priority to price stability and maximum sustainable employment, which I think are the right elements for us to consider in policy deliberations. Therefore, I believe that international economic considerations, like stock market valuations, should receive only indirect focus. We are not a closed economy, so we should recognize that our actions have effects on other economies and that those effects might, in turn, spill back into the United States. However, we are not the world's central bank, and we cannot manage aggregate demand in all parts of the globe.

Exchange rates are, nonetheless, clearly one transmission mechanism for monetary policy. Lower interest rates lead to a lower exchange rate for the U.S. dollar, making our products more competitive relative to those of other countries. This promotes exports, damps imports and leads to more rapid growth. In addition to those effects on real economic activity, exchange rates have an influence on inflation. As we saw last year, a strengthening U.S. dollar leads to lower domestic inflation by reducing the price of imports and restraining pricing power in import-competing industries. One might argue that a change in the foreign exchange value of the dollar should have a one-time effect on the price level, rather than a continuing impact on the inflation rate. But price level changes do become embedded in the inflation rate when they alter inflation expectations. It seems likely to me that the persistent weakness in import prices of recent years has helped to damp inflation expectations and hence has had a persistent damping effect on the inflation rate.

Recognizing the importance of exchange rate movements, there has also been much discussion and speculation about coordinated interest rate moves. Again, I believe that each central bank should focus on its own domestic economic setting and structure monetary policy to maintain each economy at full, sustainable employment with stable prices. Any other approach, I believe, while attractive to those who write newspaper stories, is not realistic. Finally, in the international sphere, I note that several countries have an explicit exchange rate mandate in monetary policy. In effect, they have given up a measure of independence and sovereignty in their monetary policies.
It is not clear whether these countries have a better or worse record with respect to inflation and growth than countries that do not put such a heavy weight on exchange rates in policy decisions. I suspect it varies widely across countries, and additional research on the circumstances in which such regimes succeed or fail would be welcome. I believe that the approach we follow currently, which puts greater weight on observations and projections of the direct measures of economic performance in the United States, is the best for us. However, if other countries have achieved better results by focusing on indirect measures, such as exchange rates, we should be aware of that success.

Conclusion

As you can see, these are interesting times to be a central banker. The macroeconomic challenges evolved quickly during the course of the last twelve months. The economic outlook is for slowing, but still positive, growth in the context of contained inflation and tight labor markets. To manage this complex set of forces requires continued vigilance on our part. Additionally, the last twelve months have raised an important set of questions regarding measures of real economic performance, the behavior of inflation, financial market indicators, and the growing globalization of today's economy. I have enjoyed immensely having to grapple with these issues, but I recognize that we at the central bank do not have all the answers. As you consider your research agendas, I ask you to keep in mind the potential positive contribution you can make to the Nation's welfare by focusing some of your efforts on these policy-relevant puzzles, and sharing your results with me and my colleagues.

* * *
Mr Ferguson expresses his views on monetary policy and the outlook for the US economy

Remarks by Mr Roger W. Ferguson, Jr., a member of the Board of Governors of the US Federal Reserve System, delivered at the East Hanover Area Chamber of Commerce, East Hanover, New Jersey on 15/1/99.

The Making of Monetary Policy

Thank you for inviting me to share lunch and a few thoughts with you today. I believe that it is important for the actions of this country's central bank to be as transparent as possible, and therefore I am happy to address this audience on the topic of what the Federal Reserve actually does and how it does it. I will also try to give an assessment of the national and international issues affecting Federal Reserve policy. I hope that you will conclude from this discussion that, though the process of setting monetary policy is complex, it is nonetheless in many important ways accessible to the average citizen. I also hope that you will agree that there are really very few secrets; we attempt to be as open as is responsible, which is appropriate in a democracy.

History of the Federal Reserve System and the FOMC

The Federal Reserve was created by an Act of Congress on December 23, 1913. The Federal Reserve System consists of a seven-member Board of Governors (an independent agency of the federal government with headquarters in Washington, DC), plus a nationwide network of 12 Federal Reserve Banks and 25 branches. Congress established the Federal Reserve Banks as the operating arms of the nation's central banking system, and they have both public and private elements. Neither the Board nor the Reserve Banks receive appropriations from Congress. Therefore, they do not operate with tax revenues, but rather pay expenses out of earnings. Earnings of the Federal Reserve Banks are derived primarily from interest received on their holdings of U.S. government securities and the fees they charge to depository institutions for providing services. All of the net earnings of the Banks, after expenses, contributions to surplus and payment of other assessments, are aggregated and paid over to the U.S. Treasury. In 1998, for example, the Federal Reserve paid approximately $26.5 billion to the U.S. Treasury.

The Federal Open Market Committee, or FOMC, is the most important monetary policy-making body of the Federal Reserve System. The FOMC makes the key decisions regarding the conduct of open market operations (purchases and sales of U.S. government securities), which affect the cost and availability of money and credit in the U.S. economy. The voting members of the Committee are the members of the Board of Governors and five Reserve Bank presidents. The president of the Federal Reserve Bank of New York serves on a continuous basis; the presidents of the other Reserve Banks serve one-year terms on a rotating basis, beginning on January 1 of each year. All the Reserve Bank presidents participate fully in the various discussions, regardless of whether they currently have a vote in the policy decision. By law, the FOMC must meet at least four times each year in Washington, DC. Since 1980, eight regularly scheduled meetings have been held each year. If circumstances require consultation or consideration of an action between regularly scheduled meetings, members may be called upon to participate in a special meeting or a telephone conference.

So what happens at these meetings? The order and structure of these meetings may change over time, but both have been pretty much fixed during my term on the Board.
Before each regularly scheduled meeting of the FOMC, System staff members prepare written reports on past and prospective economic and financial developments, which are sent to Committee participants. At the meeting itself, staff members present oral reports on the current and prospective business situation, on conditions in financial markets, and on international economic and financial developments. After these reports, each Committee participant, voting and nonvoting, expresses his or her views on the current state of the economy and prospects for the future. After a short coffee break, we have a staff presentation on the alternatives we face in setting monetary policy. At that point, the Chairman gives his view of the economy and makes a suggestion for the appropriate direction of policy. Each Committee member then responds to the Chairman's suggestion, in the process setting out a preferred choice for policy. After this second "go-around," we take a formal vote on the target federal funds rate for the period until the next meeting. At this point, the focus is on the voting members, who, as I noted, include all of the Governors and five of the 12 Presidents. Generally, an announcement of a change in interest rates, if any, is made at about 2:15 in the afternoon. A full set of meeting minutes is made available after the subsequent meeting.

Let me turn now to a discussion of monetary policy and the outlook for the U.S. economy. Of course, the views I shall be expressing are my own and are not necessarily shared by the FOMC or the Board of Governors.

The Goals of Monetary Policy

Federal law establishes the goals of monetary policy. The Federal Reserve and the FOMC are "to promote effectively the goals of maximum employment, stable prices, and moderate long-term interest rates". Many analysts believe that achieving price stability should be the primary goal of a central bank because a stable level of prices appears to be the condition most conducive to maximum sustained output and employment, and to moderate long-term interest rates. This presumably is because in times of price stability the prices of goods, materials and services are undistorted by inflation and thus can serve as clearer signals and guides for the efficient allocation of resources. Also, a background of stable prices is thought to encourage capital formation because it reduces the distortion created by a tax system that is only partly indexed for inflation. Some would argue that the remarkable period of growth that we are experiencing is due in no small measure to the low rate of inflation that has prevailed for some time.

The problem with the rather neat formulation that I have just given is that it does not include an element of time. In the long run, I have no doubt that price stability underpins efforts to achieve maximum output and employment. But, in the short run, some tensions can arise between efforts to reduce inflation and efforts to maximize employment and output. For example, the economy may at times be faced with adverse developments affecting supply, such as a bad agricultural harvest or a disruption in the supply of oil, which put upward pressure on prices and downward pressure on output and employment. In these circumstances, makers of monetary policy must decide the extent to which we should focus on defusing price pressures or on cushioning the loss of output and employment in the short run, in the context of our long-term objectives.
At other times, policy-makers may be concerned that expectations of inflation will get built into decisions about wages and prices, become a self-fulfilling prophecy, and result in temporary losses of output and employment. Countering this threat of inflation with a more restrictive monetary policy could risk small losses of output and employment in the short run but might make it possible to avoid larger losses later should expectations of higher inflation become embedded in the economy. The press tries to categorize FOMC members as “hawks” and “doves.” These labels are an effort to simplify, in fact oversimplify, the complex choices that each member of the FOMC must make in deciding how to trade off the risk that action, or inaction, on our part will lead to inflation heating up to unacceptable levels, as opposed to having an inadequate creation of jobs. But I believe that all members of the Committee recognize that the major contribution the Federal Reserve can make to higher standards of living over time is to promote price stability. It is generally thought that monetary policy takes many months to have most of its effect on growth and employment, while of course it has an immediate impact on financial markets. Because of the long time frame for effect on the real economy, I believe that it is important for policy to look ahead one to two years as it aims at promoting stable prices and a sustainable GDP growth path, along which the economy is at full potential. Given that we need to be forward-looking, or “preemptive,” as it is often put, one potentially important element in making monetary policy is the forecast or likely outlook for economic growth, unemployment and the price level for the next year or two. Fortunately, as we make monetary policy we have the advantage of several forecasts. The staff of the Board of Governors, private sector economists and the staffs of the Federal Reserve Banks all make forecasts. Some individual Governors and Reserve Bank Presidents also have considerable experience and expertise in forecasting. To be useful for monetary policy, forecasts of the future health of the economy need to be reasonably reliable. Good forecasts rest on the identification of empirical regularities that can be confidently relied upon to provide guidance. Regrettably, some previously reliable empirical relations have not proven so of late. As one example, the failure of price inflation to pick up as labor markets have become tighter is not consistent with older empirical observations. Moreover, factors outside of economists’ models have provided some surprises, such as the Asian economic and financial turmoil and the collapse of oil prices. If forecast relationships are less certain, it becomes more challenging to be “preemptive.” In these circumstances, I believe, it is appropriate to put greater weight on incoming data to determine whether the stance of monetary policy should be changed. Some of the variables relevant in this regard are: 1) the level of unemployment and the rate of job creation; 2) the rate of change in wages or prices or early signs of emerging inflation; 3) the rate and composition of GDP growth (inventory, trade flows, consumption, investment, residential and commercial construction, etc.); and 4) international developments. Economists refer to these variables as being from the “real” side of the economy. 
Another set of variables to be considered in making monetary policy are financial variables, such as money supply, interest rates, exchange rates, credit flows and conditions in bank lending or in the debt and equity markets. The last two years, 1997 and 1998, gave an interesting case study of the interplay between these different variables and the degree of forward-looking behavior in the making of monetary policy. From March of 1997 through much of 1998, pleasant surprises in the performance of inflation and the general absence of early signs of inflation, and uncertainty about the relationship of inflation to changes in the level of production, kept the FOMC from tightening. We did not tighten despite the economy being beyond most estimates of its potential and despite many forecasts of rising inflation. Put another way, the FOMC could have “preemptively” tightened monetary policy, based on forecasts, but, recognizing the uncertainties about empirical relationships, chose not to do so. But in the fall of 1998, you may recall, there was concern regarding the performance of the debt markets, especially the bond market and the market for commercial paper. These concerns centered on the fact that corporate borrowers could no longer raise funds in the bond or commercial paper markets at reasonable prices or, at some times and for some borrowers, at all. In this case, we were reasonably confident that constraints on the ability of corporations to borrow would eventually have a negative effect on their willingness to invest in productive capacity and therefore on their ability to provide goods and services and to create jobs. A related concern was that weakened foreign economies might have adverse consequences for future domestic activity. Under those circumstances, the FOMC thought it appropriate to ease to offset a likely significant impact on future domestic spending and growth. Finally, no discussion of monetary policy and financial markets would be complete without reference to the equity markets. I believe that the Fed cannot target specific levels in equity markets. However, equity markets have spillover effects into the real economy and hence send important signals to policy-makers. As you know, economists often speak of the “wealth effect,” and econometric modeling indicates that consumers tend to raise the level of their spending by about 2 to 4 percent of incremental wealth, after two or three years. Through the so-called wealth effect, equity valuations can and do have an effect on consumption and on macroeconomic performance. Additionally, equity markets are a source of investment capital, and valuations in the stock market are one determinant of the cost of capital for businesses. Therefore, equity prices have an influence on business fixed investment, along with consumption, the major drivers of our economy. Finally, equity markets are of interest to policy-makers because we have a responsibility for macro-stability. We have seen in other economies that bubbles and busts in financial markets can create unsettled conditions that impair real economic activity. Therefore, while it would be incorrect to say that policy-makers target the equity markets or that market concerns “tie the hands” of the Fed, the markets are an important consideration in macroeconomic analysis. These are just some of the factors that might go into making monetary policy. 
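To make the wealth-effect arithmetic above concrete, here is a minimal sketch in Python. The 2 to 4 percent propensity to spend out of incremental wealth is the range cited in the remarks; the $2 trillion wealth gain is purely a hypothetical input chosen for illustration.

    def added_consumption(wealth_gain, mpc_low=0.02, mpc_high=0.04):
        # Range of extra annual spending implied by a rise in household
        # wealth once the effect is fully realized (two to three years,
        # per the remarks). The propensities are the cited 2-4 percent.
        return wealth_gain * mpc_low, wealth_gain * mpc_high

    low, high = added_consumption(2_000_000_000_000)  # hypothetical $2 trillion gain
    print(f"implied extra spending: ${low / 1e9:.0f} to ${high / 1e9:.0f} billion per year")

On these assumptions, a $2 trillion rise in equity wealth would eventually add roughly $40 to $80 billion a year to consumer spending, which is why equity valuations matter to macroeconomic analysis even when they are not a policy target.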
The interesting part of the job is that the relative importance of these factors – forecasts, current macroeconomic conditions, financial market conditions and others – is often in flux. For those who seek to monitor our actions, the good news is that through a reading of FOMC announcements and minutes, speeches by Governors and Presidents, and interviews, an observer can get a pretty good handle on which variables are uppermost in each policy-maker’s mind at any given time. Often, however, many factors are relevant, and we cannot indicate precisely the relative weights the FOMC may be applying to them in making policy. After all, the economy is influenced by all of the factors I have outlined, and the FOMC’s emphasis on specific factors varies over time as economic conditions change. Outlook for 1999 With this general statement of the factors that go into setting monetary policy, let me turn to the outlook for 1999. Recent economic data have been stronger than many had expected. It now appears to many forecasters that real GDP growth for 1998 was 3.5 percent or more on a fourth-quarter-to-fourth-quarter basis. The consensus, as represented by the most recent Blue Chip Economic Indicators, is that real growth in 1999 will moderate, perhaps to slightly over 2 percent, again on a fourth-quarter-to-fourth-quarter basis (nearly 2.5 percent on a year-over-year basis). This consensus outlook, if accurate, would eventually produce a “soft landing”, with growth near the economy’s potential and inflation remaining low. The range of forecasts that compose this so-called consensus is large, perhaps indicating that forecasters are cognizant of the fact that it is not hard to imagine risks on both sides of this scenario. I should also note that this is a repeat of earlier forecasts that growth would moderate, and the economy has surprised many forecasters with its resilience. I can see that 1999 will require a continued high level of vigilance on the part of policy-makers. Not surprisingly, many of the forces and uncertainties that seemed to shape so much of last year’s discussion are present in the outlook. Tight labor markets are putting pressure on wages, but competitive markets are limiting pricing leverage and causing profits to be squeezed. Under these circumstances, will businesses become more cautious in their spending and hiring plans? Rising stock prices continue to point to strong consumer demand, but valuations are high by historical standards and one wonders whether this stimulus will remain so strong. The downturns in some of the troubled economies of Asia seem to be bottoming out, at least outside of Japan, but the risk of spreading distress in Latin America creates another element of international uncertainty. Recent events in Brazil have made this concern more evident. It is now even more important that Brazil move as quickly as possible to implement a clearly sustainable fiscal position and regain the confidence of international markets and investors. The Federal Reserve will continue to monitor closely international developments and their potential effect on domestic activity. At the same time, U.S. inflation has been contained, at least until recently, by a number of factors, such as a rising exchange rate for the U.S. dollar through much of last year, well-contained health care costs, and declining prices for oil and other commodities, which may prove to be short-lived. The ability of businesses to pass on price increases has been constrained, in part, by international competition. 
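The fourth-quarter-to-fourth-quarter and year-over-year figures cited in the outlook above measure different things, and the sketch below shows how the same path of output can yield noticeably different numbers on the two bases. All the quarterly GDP levels are invented for illustration only.

    # Hypothetical quarterly GDP levels (index numbers), for illustration only.
    q_1998 = [100.0, 101.0, 102.0, 103.5]
    q_1999 = [104.0, 104.5, 105.0, 105.6]

    # Q4/Q4 growth compares the final quarter of each year.
    q4_over_q4 = (q_1999[3] / q_1998[3] - 1) * 100

    # Year-over-year growth compares annual averages, so strong growth late
    # in 1998 carries over into the 1999 year-over-year figure.
    year_over_year = (sum(q_1999) / sum(q_1998) - 1) * 100

    print(f"Q4/Q4: {q4_over_q4:.1f}%  year-over-year: {year_over_year:.1f}%")

In this invented example the economy grows about 2.0 percent Q4/Q4 during 1999, yet the year-over-year figure is roughly 3.1 percent because of the momentum built up during 1998; the same carryover effect explains why the two consensus numbers quoted above can differ.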
Will the restraint on pricing power remain as strong or weaken? If the latter, at what pace? Will the special factors I mentioned above continue to mitigate inflationary pressures arising from labor markets? Are there other, longer-term forces at work damping price pressures that we have not yet identified? A new factor that I find most interesting as we start this year is the likely impact of the upcoming century date change. As you may know, I am Chairman of the Joint Year 2000 Council, which is a group of financial regulators that has spent the last nine months focusing on the Year 2000 computer problem. In an international context, the Year 2000 is likely to have differing impacts across different regions. Here in the United States, my colleague Governor Mike Kelley has stated that we are likely to see some disruptions to economic activity because of Year 2000 problems, but the effects are likely to be temporary and quickly reversed. This outcome also seems likely to me. Overseas, the early signs are that Europe has successfully converted its computer systems to the euro, but there were some periods early in the transition when the settlement of some cross-border transactions in Europe did experience end-of-day glitches. It is now important for senior management of major European institutions to turn their attention to preparations for the Year 2000. Asian financial and other business firms, too, should focus energy on preparing for the Year 2000 even as they are restoring fundamental financial soundness. Indeed, the events of the last 18 months have probably distracted their attention from this problem. It is in everyone’s interest that we not become complacent as we enter these last 12 months before the start of the new millennium. Conclusion The Federal Reserve has an important role to play in our economy. However, our role in the economy should be kept in perspective. We control just a few of the levers that drive the economy. We observe consumption, investment and labor market decisions and try to adjust monetary policy to the signals that we receive so that our nation’s economic welfare is not threatened by inflation or by growth that is below the economy’s potential. However, it is the decisions and actions that you take as consumers and business managers that ultimately determine the health of the United States economy. Despite difficult periods, 1998 turned out to be a very good year because of a mix of private action and economic policy. I hope that 1999 will be as good to us all. ***
Mr Greenspan testifies on the state of the US economy Testimony of the Chairman of the Board of Governors of the US Federal Reserve System, Mr Alan Greenspan, before the Committee on Ways and Means of the US House of Representatives on 20/1/99. The American economy through year-end continued to perform in an outstanding manner. Economic growth remained solid, and financial markets, after freezing up temporarily following the Russian default, are again channeling an ample flow of capital to businesses and households. Labor markets have remained quite tight, but, to date, this has failed to ignite the inflationary pressures that many had feared. To be sure, there is decided softness in a number of manufacturing industries as weakness in many foreign economies has reduced demand for US exports and intensified competition from imports. Moreover, underutilized production capacity and pressure on domestic profit margins, especially among manufacturers, are likely to rein in the rapid growth of new capital investment. With corporations already relying increasingly on borrowing to finance capital investment, any evidence of a marked slowing in corporate cash flow is likely to induce a relatively prompt review of capital budgets. The situation in Brazil and its potential for spilling over to reduce demand in other emerging market economies also constitute a possible source of downside risk for demand in the United States. So far, markets seem to have reacted reasonably well to the decisions by the Brazilian authorities to float their currency and redouble efforts at fiscal discipline. But follow through in reducing budget imbalances and in containing the effects on inflation of the drop in value of the currency will be needed to bolster confidence and to limit the potential for contagion to the financial markets and economies of Brazil’s important trading partners, including the United States. While there are risks going forward, to date domestic demand, and hence employment and output in the United States, certainly has remained vigorous. Though the pace of economic expansion is widely expected to moderate as 1999 unfolds, signs of an appreciable slowdown as yet remain scant. But to assess the economic outlook properly, we need to reach beyond the mere description of America’s sparkling economic performance of eight years of record peacetime expansion to seek a deeper understanding of the forces that have produced it. I want to take a few moments this morning to discuss one key element behind our current prosperity: the rise in the value markets place on the capital assets of US businesses. Lower inflation, greater competitiveness, and the flexibility and adaptability of our businesses have enabled them to take advantage of a rapid pace of technological change to make our capital stock more productive and profitable. I will argue that the process of recognizing this greater value has produced capital gains in equity markets that have lowered the cost of investment in new plant and equipment and spurred consumption. But, while asset values are very important to the economy and so must be carefully monitored and assessed by the Federal Reserve, they are not themselves a target of monetary policy. We need to react to changes in financial markets, as we did this fall, but our objective is the maximum sustainable growth of the US economy, not particular levels of asset prices. 
As I have testified before the Congress many times, I believe, at root, the remarkable generation of capital gains of recent years has resulted from the dramatic fall in inflation expectations and associated risk premiums, and broad advances in a wide variety of technologies that produced critical synergies in the 1990s. Capital investment, especially in high-tech equipment, has accelerated dramatically since 1993, presumably reflecting a perception on the part of businesses that the application of these emerging technological synergies would engender a significant increase in rates of return on new investment. Indeed, some calculations support that perception. They suggest that the rate of return on capital facilities put in place during recent years has, in fact, moved up markedly. In part this may result from improved capital productivity – that is, the efficiency of the capital stock. In addition, we may be witnessing some payoffs from improved organizational and managerial efficiencies of US businesses and from the greater education – in school and on the job – that US workers have acquired to keep pace with the new technology. All these factors have been reflected in an acceleration of labor productivity growth. Parenthetically, improved productivity probably explains why the American economy has done so well despite our oft-cited subnormal national saving rate. The profitability of investment here has attracted saving from abroad, an attraction that has enabled us to finance a current account deficit while maintaining a strong dollar. Clearly, we use both domestic saving and imported financial capital in a highly efficient manner, apparently more efficiently than many, if not most, other major industrial countries. While discussions of consumer spending often continue to emphasize current income from labor and capital as the prime sources of funds, during the 1990s, capital gains, which reflect the valuation of expected future incomes, have taken on a more prominent role in driving our economy. The steep uptrend in asset values of recent years has had important effects on virtually all areas of our economy, but perhaps most significantly on household behavior. It can be seen most clearly in the measured personal saving rate, which has declined from almost 6% in 1992 to effectively zero today. Arguably, the average household does not perceive that its saving has fallen off since 1992. In fact, the net worth of the average household has increased by nearly 50% since the end of 1992, well in excess of the gains of the previous six years. Households have been accumulating resources for retirement or for a rainy day, despite very low measured saving rates. The resolution of this seeming dilemma illustrates the growing role of rising asset values in supporting personal consumption expenditures in recent years. It also illustrates the importance when interpreting our official statistics of taking account of how they deal with changes in asset values. With regard first to the statistical issues, capital gains themselves are not counted as income, but some transactions resulting from capital gains reduce disposable household income as we measure it, while having no effect on consumption. As a consequence, as capital gains and these associated transactions mount, published saving rates are decreased. 
For example, reported personal income is reduced when corporations cut back payments into defined-benefit pension plans owing to higher equity prices; however, such reductions do not diminish anticipated retirement income and thus should not lower consumption. And reported disposable income is decreased when households pay taxes on capital gains realizations that would not have been so large in less ebullient markets. However, capital gains tax payments also are highly unlikely to be associated with lower spending because the cash realized from the sale of the asset exceeds the tax, and in most cases the typical household presumably does not perceive this transaction as reducing available income or financial resources. Together these two effects probably account for an appreciable portion of the reduction in the reported saving rate. But beyond these statistical issues, there is little doubt that capital gains have increased consumption relative to income from current production over recent years. Economists have long recognized a “wealth effect” – a tendency for consumption to rise by a fraction of the capital gains on existing assets owned by households – though the magnitude of this effect remains difficult to estimate accurately. We have some evidence from recent years that all or most of the decline in the saving rate is accounted for by the upper income quintile, where the capital gains have disproportionately accrued, which suggests that the wealth effect has been real and significant. Thus, all else equal, a flattening of stock prices would likely slow the growth of spending, and a decline in equity values, especially a severe one, could lead to a considerable weakening of consumer demand. Some moderation in economic growth, however, might be required to sustain the expansion. Through the end of 1998, the economy continued to grow more rapidly than can be currently accommodated on an ongoing basis, even with higher, technology-driven productivity growth. Growth has continued to shrink the pool of workers who are willing to work but do not have jobs. While higher productivity has helped to keep labor cost increases in check, it cannot be expected to do so indefinitely in ever tighter labor markets. Despite brisk demand and improved productivity growth, corporate profits have sagged over recent quarters. This is attributable in part to some acceleration in labor compensation, but other factors have also been pressing, especially intensified competition and lower prices facing our exporters and those industries competing with imports. In these circumstances, businesses will feel under considerable pressure to preserve profit margins should labor costs accelerate further, or should the falling prices of commodity inputs, like oil, turn around. But, to date, businesses’ evident pricing power has been scant. Either that would change and inflation could begin to mount or, if costs could not be recouped, capital outlays might well be cut back. The recent behavior of profits also underlines the unusual nature of the rebound in equity prices and the possibility that the recent performance of the equity markets will be difficult to sustain. The level of equity prices would appear to envision substantially greater growth of profits than has been experienced of late. Moreover, the impressive capital gains of recent years would seem also to rest on a perception of relatively low risk in corporate ownership. 
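A small numerical sketch may help fix the capital gains tax mechanics described earlier in this testimony: the tax on a realized gain lowers measured disposable income even though the household's spending is unchanged, so the published saving rate falls. All the dollar figures below are hypothetical.

    # Hypothetical household: income and spending unchanged, but a realized
    # capital gain triggers a tax that is subtracted from measured income.
    income = 50_000        # disposable income before the capital gains tax
    consumption = 47_000   # spending, assumed unchanged by the transaction
    gains_tax = 2_000      # tax paid on a realized capital gain

    rate_without_tax = (income - consumption) / income
    measured_income = income - gains_tax   # what the official statistics record
    rate_with_tax = (measured_income - consumption) / measured_income

    print(f"saving rate without the gains tax: {rate_without_tax:.1%}")  # 6.0%
    print(f"measured saving rate with it:      {rate_with_tax:.1%}")     # about 2.1%

The household behaves identically in both cases; only the measured saving rate differs, which is the sense in which the published rate can overstate the decline in actual saving behavior in ebullient markets.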
Risk aversion and uncertainty rose sharply over the late summer and fall of 1998 following the Russian default in mid-August, as evidenced by widening spreads among yields on debt of differing credit qualities and liquidity. The rise in uncertainty increased the discounting of claims on future incomes, and that reduced stock market prices even as the long-term earnings growth expectations of security analysts continued to rise. As risk aversion subsided after mid-October, stock prices returned to record levels. Markets have doubtless stabilized significantly after the turbulence of last fall, but they remain fragile, as the repercussions of the recent Brazilian devaluation attest. Moreover, our chronic current account deficit has widened significantly, in part reflecting the strength of domestic demand that has accompanied the further accumulation of capital gains. The continued increase in our net external debt and its growing servicing costs clearly are not sustainable indefinitely. In light of the importance of financial markets in the economy, and of the volatility and vulnerability in financial asset prices more generally, policymakers must continue to pay particular attention to these markets. The Federal Reserve’s easing last fall responded to an abrupt stringency in financial markets and the effects that the consequent increased risk aversion was likely to have on economic activity going forward. We were particularly concerned about higher costs and disrupted financing in debt markets, where much of consumption and investment is funded. We were not attempting to prop up equity prices, nor did we plan to continue to ease rates until equity prices recovered, as some have erroneously inferred. This has not been, and is not now, our policy or intent. As I have discussed earlier, movements in equity prices can play an important role in the economy, which the central bank must take into account. And we may question from time to time whether asset prices may not embody a more optimistic outlook than seems reasonable, or what the consequences might be of a further rise in those prices followed by a steep decline. But many other forces also drive our economy, and it is the performance of the entire economy that forms our objectives and shapes our actions. Nonetheless, in the current state of financial markets, policymakers are going to have to be particularly wary of actions that unnecessarily sow uncertainties, undermine confidence, and interfere with the efficient allocation of capital on which our economic prosperity and asset values rest. It is important not to undermine the highly sensitive ongoing process of reallocation of capital from less to more productive uses. For productivity and standards of living to grow, not only must capital raised in markets be allocated efficiently, but internal cash flow, including the depreciation charges from the existing capital stock, must be continuously directed to its most profitable uses. It is this continuous churning, this so-called creative destruction, that has become so essential to the effective deployment of advanced technologies by this country over recent decades. In this regard, drift toward protectionist trade policies, which are always so difficult to reverse, is a much greater threat than is generally understood. 
It is well known that erecting barriers to the free flow of goods and services across national borders undermines the division of labor and standards of living by impeding the adjustment of the capital stock to its most productive uses. Not so well understood, in my judgment, is the impact that fear of growing protectionism would have on profit expectations, and hence on the current values of capital assets. Protectionism was a threat to standards of living when capital asset values were low relative to income. It becomes particularly pernicious in an environment, such as today’s, when that is no longer the case. In sum, it has been the ability of our flexible and innovative businesses and workforce that has enabled the United States to take full advantage of emerging technologies to produce greater growth and higher asset values. Policy has facilitated this process by containing inflation and by promoting competitiveness through deregulation and an open global trading system. Our task going forward – at the Federal Reserve as well as in the Congress and Administration – is to sustain and strengthen these policies, which in turn have sustained and strengthened our now record peacetime economic expansion. ***
Mr Ferguson remarks on the international millennium challenge Remarks by Mr Roger W Ferguson, Jr, a member of the Board of Governors of the US Federal Reserve System and Chairman of the Joint Year 2000 Council, at the Bank Administration Institute, Orlando, Florida on 21/1/99. Thank you for your invitation to speak today on the timely and critical topic of the Year 2000 challenge. This meeting reinforces the internationally recognized leadership role of the financial services industry in promoting Year 2000 preparedness. This morning I would like to discuss the global challenge presented by the Year 2000 problem and describe international efforts, particularly those of the Joint Year 2000 Council, to assist the worldwide financial services industry to be ready for the century date change. Of course, the obligation to fix the Year 2000 problem belongs ultimately to you in the private sector. Therefore, I will also discuss the important role the financial services industry can play to further mitigate risk, to develop sound contingency plans, and to serve as a catalyst in promoting best practices among your suppliers and customers. Global Challenge The impact of the Year 2000 date change on computer systems in the financial services industry presents a significant global challenge. The risks of some disruption to international trade and financial markets are not to be discounted completely, and the consequences of inaction are grave. The world economy functions through its financial services markets, and it is not a surprise, therefore, that the Year 2000 preparedness of the financial services industry is of great interest. Widespread and long-lasting disruptions, should they occur, could seriously impair commercial activities in countries where economies are already weakened. Moreover, and possibly the most onerous consequence, if market participants were to lose confidence, many countries, including the United States, may be faced with unusual liquidity demands. The good news is that we are not faced with inaction. I find that the seriousness of the Year 2000 problem is gaining increased national and international attention. Still, with less than a year left before the millennium change, it remains difficult to judge its global impact. Given the sheer number of computers and chip-based systems, and the manual nature of the fixes that have emerged to date, we are likely to experience some degree of disruption, which I believe most likely will prove to be mild and short-lived. I do not think that we will face global recession. Ultimately I believe the number and extent of computer system disruptions will not be used as the measure of our success in addressing the Year 2000 problem. Instead, our success will be measured by our ability, and the public’s confidence in our ability, to conduct business operations effectively on January 3, 2000, and thereafter. Current State of International Year 2000 Some foreign governments and institutions started their Year 2000 program years ago, while others were initially reluctant to recognize that a problem even existed. This reluctance hindered efforts to take early action. In some areas, already limited resources were dedicated to other activities, such as the euro currency conversion or attempts to restructure weakened economies. Additionally, those countries that recognized the problem early and had sufficient resources assembled technical experts from whatever source was available. 
Recognizing these challenges, in April 1998, the Bank for International Settlements hosted a Year 2000 roundtable to raise international awareness. At the close of the conference, the sponsoring organizations, the Basle Committee on Banking Supervision, the Committee on Payment and Settlement Systems, the International Association of Insurance Supervisors, and the International Organization of Securities Commissions, established the Joint Year 2000 Council. The Council was formed as a direct result of the recognition of the complexity of the global financial services industry and the need to communicate proactively with regulatory and supervisory authorities. The Group of Seven has welcomed and endorsed the establishment of the Council and has requested it to develop the necessary initiatives to raise the Year 2000 issue to the attention of senior financial regulators. As Chairman, I work with other members of the Council to ensure that a high level of attention is given to the challenge within the global financial supervisory community and to serve as a point of contact with national and international private sector initiatives. To that end, the Council operates with the participation of central banks, insurance and securities regulators, and banking supervisors. It is the first international body that brings together such a range of financial regulators. It is important to note that the Joint Year 2000 Council is not the “international Year 2000 enforcement agency”. We do not have onsite examiners, nor are we meant to replace the efforts of national regulators. The Council established an External Consultative Committee, or ECC, to enhance the degree of information sharing between the public and private sectors. This committee includes representatives of internationally oriented organizations including the Global 2000 Coordinating Group, the International Monetary Fund, the World Bank, and financial service providers, such as S.W.I.F.T., Euroclear and Cedel. The Council brings together an unprecedented number of regulators and supervisors and, with the input of the ECC, it has developed a common dialogue within the global regulatory community. To fulfill its mission, the Council has proposed and executed a number of initiatives. Our most important role may be to provide a forum to facilitate information sharing and cooperation among supervisors. To accomplish this, we have developed a global supervisory contact list of over 600 financial regulators, and initiated several mechanisms for communicating with them. Our most visible mechanism may be a series of bulletins, which come out monthly, on different themes and topics. Additionally, the Council has been supporting, co-sponsoring and providing assistance in planning conferences and roundtables on the Year 2000 challenge and will continue to do so. To date, we have conducted meetings for regulators from Europe, Asia-Pacific, North and South America and the Caribbean, and the Middle East. Tomorrow, the Council will sponsor a meeting for the African region in Pretoria. These conferences have provided an excellent means of bringing supervisors together to discuss common interests within specific geographic areas. When we first started, some supervisory authorities had not raised the Year 2000 problem to a very high level of attention. Much of this delay could be attributed to resource demands and the lack of a sufficient knowledge base on Year 2000 issues. 
To address this information shortfall, the Council determined that the supervisory community would benefit from written guidance papers that address key phases in the Year 2000 process and specific responsibilities of supervisors and regulators. These statements would serve to raise awareness and share best practices. To date we have issued four of these papers, on: 1) the independent assessment of financial institution preparedness, issued in June 1998, 2) the origin, scope, and seriousness of the problem and the potential impact on the financial services industry, 3) testing for Year 2000 readiness, and 4) suggested methods for information sharing among members of the financial services industry, its third-party providers, and supervisory authorities. In an effort to encourage disclosure and transparency, the Council has also developed a program to collect and publish key Year 2000 information that will emphasize what initiatives are underway throughout the world. The Council has asked public sector organizations in more than 170 countries to provide information on national governments, central banks, financial market supervisors, operators of exchanges and payment settlement and trading systems, financial sector industry associations, and major utilities. The information is posted on our web-site on a country-by-country basis. What is the State of International Readiness? What is the state of “international readiness”? As you understand, it is difficult to measure accurately the level of international Year 2000 readiness, and certainly no one can predict with confidence exactly how the century date change will unfold internationally. There are three reasons for this difficulty. First, there is no single indicator that can be used to judge the overall Year 2000 readiness of any country, including the United States. Second, the state of readiness is a moving target. Judgements are usually based on anecdotal information obtained either first-hand through interviews or surveys or second-hand through other sources. The best information is subject to change. Finally, while it may be possible to reach conclusions about the Year 2000 readiness of individual industry sectors, it remains difficult to assess the interdependency of critical systems across industry sectors. In addition, the state of Year 2000 readiness of a country’s public sector may not be an accurate indicator of the status of its private sector. With that said, let me briefly summarize my own observations on international Year 2000 readiness of the financial sector. I do not have information on all countries, and cannot discuss them in great detail. I do believe that, as with the United States, in most countries the financial sector was probably somewhat ahead of other sectors in recognizing the Year 2000 problem, and is probably somewhat more advanced in remediation. I am optimistic that the western European banking sector, in general, will continue its hard work to be Year 2000 ready. I have spoken with a number of western European financial market participants about the Year 2000 issue, and each of them has stressed the commitment of his or her organization to do everything possible to limit any Year-2000-related disruption. Although the conversion of computer systems to accommodate the euro was a high priority for most western European banks, the European Commission indicated in a report last December that the financial sector was “exemplary” in its progress and the level of coordination taking place. 
The experience gained by the banks in testing with financial service counterparts for the euro conversion will, I hope, provide efficiencies for comparable Year 2000 testing. I understand that there is considerable progress in Japan’s Year 2000 readiness. In April of last year, it was unclear whether the Japanese recognized Year 2000 as an industry-wide problem. In addition, it was difficult to determine whether the public or private sector had developed clear plans to ensure Year 2000 readiness. During the past few months, however, I have read reports from rating agencies indicating that Japanese banks seem to be making progress and that the results of their conversations with Japanese banks have been “somewhat encouraging”. The Japanese regulators are scrutinizing their banks closely, and I understand that the Bank of Japan is conducting four industry-wide domestic tests between now and June of this year. In addition to this information about these markets, I can report that, through the regional meetings of regulators that I referred to above, we have now had contact with regulators from over 100 countries, and the degree of awareness among regulators is, I find, uniformly high. This is a good underpinning for them to ask the proper questions of the banks they oversee, and to assist those institutions in making progress. Finally, we should be interested in the preparations of the infrastructure providers that support the global financial system. Central banks, including the Federal Reserve, have met with S.W.I.F.T. and other international infrastructure providers such as CHIPS, Euroclear and Cedel Bank to discuss their plans and progress on internal readiness. S.W.I.F.T. cooperates closely with the National Bank of Belgium in communicating the status of that entity’s Year 2000 program. These various entities are heavily focused on the Year 2000 problem, are participating extensively in Year 2000 testing, and appear to be working quite cooperatively among themselves and with their customers to resolve this problem. In this regard, let me discuss briefly the linkage of international and domestic payment systems. We believe that the operational procedures that exist today among the Federal Reserve’s domestic payment system, Fedwire, and international payment systems will probably act as firewalls to prevent the risk of contagion and limit financial exposure from international Year 2000 difficulties, if any should emerge. First, as you well know, the Year 2000 “bug” is not a virus that passes from one software program to another. The concern, rather, is whether a significant amount of erroneous payments could be transmitted electronically. There are procedures and checks in the handling of international payment orders that minimize the likelihood of financial loss from erroneous payments being made through these electronic systems. These checks include: 1) the monitoring of account status done by the Federal Reserve, by other payment mechanisms, notably CHIPS, and by internationally active commercial banks, and 2) counterparty exposure limits imposed on and by each of these players. Finally, these infrastructure providers are in communication to discuss contingency plans. What Should We All Be Doing This Year? While the Council recognizes that, in many countries, the financial industry is ahead of other sectors in its Year 2000 preparations, uncertainties will remain with regard to the readiness of systems both internal and external to all organizations. 
It is the responsibility of every individual private sector firm to resolve its Year 2000 problems. Public sector policymakers and regulators have an important role to play in fostering appropriate actions by market participants and working to achieve the proper coordination between the various authorities involved, within the financial sector as well as at the national level. During these past eight months, we have identified four critical themes common to successful Year 2000 programs in both the private and public sectors. First, while much productive work has been carried out around the world to adapt systems that need to be remediated, we need to avoid complacency and continue to maintain a high degree of attention at the most senior level. Senior management and the boards of directors in the private sector must continue to allocate sufficient human and financial resources. Senior executives should be fully aware of their firm’s dependence on third-party vendors, service providers, customers, counterparties, and public infrastructures, and also of the consequent risks these dependencies pose. The public sector should encourage financial market participants to continue to devote the maximum senior management attention and priority to this critical issue. Second, enhanced information sharing is critical to readiness efforts and to avoiding unnecessary uncertainty in financial markets. In many countries, there is currently a lack of adequate information on Year 2000 readiness. This is of concern because it impedes efficient preparations by market participants and may exacerbate negative perceptions in the marketplace. Financial institutions should share information in order to strengthen confidence that the Year 2000 challenges are being met in all financial sectors worldwide. In some countries, regulators have mandated public disclosure, such as requiring regular report filings that are made public. Regulators can also play a constructive role in making sure that the Year 2000 information that they disseminate to the public is factually accurate, balanced and broadly disseminated. Third, comprehensive contingency strategies should be developed in order to minimize disruptions. It is important that individual firms review their current business continuity plans and evaluate them in light of possible problems arising from the Year 2000 transition. Regulators should develop and implement appropriate contingency plans in order to prepare for possible disturbances relating to the millennium transition. The Council plans to publish in the near future policy papers on contingency planning for individual financial institutions and for individual regulators. Government agencies should encourage telecommunication, utility, and other infrastructure service providers to share and coordinate contingency plans. Fourth, modifications resulting from system or regulatory changes that have the potential to disrupt Year 2000 programs should be limited. It is important that firms carefully manage the risks associated with making changes to their internal information systems in 1999 and early 2000. Firms should avoid introducing new IT applications that could generate changes to critical business applications that have been renovated, tested and certified. The Federal Reserve in October 1998 implemented an internal change management policy that restricts system changes to those deemed absolutely critical. 
Similarly, legislatures and regulators should limit legal and regulatory changes that might require resources to be diverted from a financial organization’s Year 2000 program. Before issuing a legal or regulatory change, regulators should consider whether or not the action could be delayed until after 2000. The Public’s Obligations with Respect to Year 2000 Finally, all of us as members of the public have obligations with respect to Year 2000. The first is to maintain perspective and rely on common sense. As we get closer to the century date change, there will no doubt be more sensational coverage in the media. Our obligation, as always, is to be smart consumers of information and to listen to responsible, not alarmist, voices. Remember, as with anything that has a degree of uncertainty, there will always be those who predict the most dire outcomes. They have generally been wrong in the past, and I expect that they will be wrong again. Our second obligation is to maintain reasonable and responsible patterns of behavior. There are likely to be some disruptions from the century date change; nothing this complex can be perfectly faultless. However, we should remember that there have been serious disruptions to service in daily life before, from storms, temporary electrical outages, disruptions of telephone service, etc. In general, these prove to be annoying and inconvenient, but nothing more. Finally, we should all recognize that most systems are built to withstand reasonable service demands. If we stress them by changing normal usage patterns, we may experience delays, not because of Year 2000 problems, but rather due to capacity overload. Conclusion I believe that the financial services industry has made great progress in addressing Year 2000 issues. I am cautiously optimistic that the United States and the global economy will weather the century date change without major disruptions to economic activity. During this next year you will not only need to do your best to continue to repair and test your own systems, but will also need to evaluate the risk of potential failures and the effect of these failures on your business. Further, financial institutions will need to test fallback procedures or work-around processes that mitigate the effect of such failures on their ability to continue to conduct business. If we accept the Year 2000 problem as a worldwide problem, then we must focus our attention broadly. Whether the extent of your business resides only within one community or extends internationally, some portion of every firm’s Year 2000 efforts must be coordinated and cooperative. This conference is an example of such coordination and cooperation. Finally, we must continue to be a reliable source of accurate and sound information to maintain the public’s trust. I do not doubt that we can collectively rise to these challenges. * * *
Mr Ferguson gives a 20-year overview of economic developments in Latin America Speech by Mr Roger W Ferguson, Jr, member of the Board of Governors of the US Federal Reserve System, to the Florida International Bankers Association in Miami on 11/02/99. Thank you very much for inviting me to join you this evening here in Miami. It is a pleasure to become a member of the roster of Federal Reserve Governors who have addressed this joint meeting of the Florida International Bankers Association and the Miami Bond Club. My theme this evening will mirror the fact that international banking has grown significantly in Florida in the more than 20 years that have elapsed since the passage of the Florida International Banking Act. Many of you in the audience represent institutions that are active in trade finance throughout the hemisphere, and there is a strong natural link between Florida and Latin America. Similarly, about 20 years ago I studied Latin American economic development with Albert O Hirschman, one of the great development economists of our time. Therefore, the topic I would like to discuss this evening is what has worked, and what apparently has not, in Latin America in the last 20 years or so. The main message is that, after the 1980s debt crisis, many Latin American countries embarked on a path of greater economic freedom, lessened government intervention in markets and sounder policies. Those policies provided the basis for heightened economic dynamism and growth, and thereby led to significant benefits for their citizens, including higher incomes, lower inflation, and a wider range of economic opportunities. In view of my audience today, I should also mention that, partly as a consequence of improved and more market-friendly policies, many of Latin America’s banking systems are endeavoring to become more efficient, more stable, and better integrated with global financial markets. As a result, Latin American banks should become better able to support the process of stable, non-inflationary growth in the future. The current economic and financial problems in Brazil should, if anything, reinforce the importance of pursuing sound fiscal and monetary policies and improvements in the underlying institutions that support economic activity. Background Twenty years ago, or by the end of the 1970s, Latin America was on the verge of moving from a phase of unsustainable economic activity – based on high domestic consumption, heavy borrowing from abroad, unsustainable currency levels, and excessive intervention by government in the economy – into a decade of lost growth and lost opportunities known as the Debt Crisis. As the 1980s began, Latin American countries saw prices of their exports plunge, interest rates skyrocket, and access to international capital cut back severely. In this harsh new environment, the shortcomings of previous policies became even more apparent, and economic performance faltered. Currency values plummeted as governments ran out of reserves, inflation soared, in some cases to as much as triple-digit levels, and the output of the region contracted severely. Key social and developmental priorities had to be scaled back as governments struggled to finance their budgets and find the funds to repay foreign creditors. 
Domestic financial systems were thrown into disarray and many banks were severely weakened, both in direct response to the shocks hitting the region, and also as a result of misguided government policies undertaken in response to economic and political problems. Discussion of Latin America seemed focused mostly on issues of hyperinflation, high unemployment, and repaying the external debt. Not surprisingly, under these circumstances, job creation during the 1980s was slow, many domestic businesses did not invest, and direct foreign investment was weak as well. These problems were not merely the result of uncontrollable, external shocks. They also reflected the fact that many governments in the region were slow to enact the reforms that would lay the foundation for future growth. In fact, in many cases governments responded to problems with policies that made things worse, not better, including wage and price controls, freezes on domestic interest rates, and adoption of protectionist trade policies. These latter were in some ways particularly problematic, in that Latin American countries in general did not focus on growing out of their problems through expansion of foreign trade. Trade flows during the period were relatively small, compared with what one might expect from economies of their size, and the trade that did occur was highly tilted toward raw materials and semi-processed industrial products. In sum, notwithstanding some bright spots, the tone of discussion regarding Latin America seemed biased toward discussion of problems and not of opportunities. Current conditions I cannot say that all of the problems of the late 1970s and 1980s have evaporated. Indeed, as is illustrated by the current situation in Brazil, there are clearly problems still to be resolved. It continues to be important for Brazil to implement macroeconomic and structural policies that restore international confidence and also reassure citizens that inflation will not reignite. However, much does seem to have changed with respect to Latin America during the intervening period, and the region as a whole, even recognizing the ongoing challenges of the current situation, appears to be closer to achieving the promise that it showed a generation ago. Most of the countries in the region have experienced growth since the 1980s. Indeed, the growth rate of Latin American countries has averaged about 3½% annually during this decade. In addition, the scourge of inflation seems, in general, to have been brought under control. The costs were high, but the battle seems mainly to have been won. Admittedly, the worst inflation rates, at about 40%, are still high by the standards of the United States. However, they are much reduced from those that existed 20 years or even 10 years ago. Last year, Argentina’s inflation rate was under 1%. That is down from nearly 5000% in 1989! Since many of your institutions are involved in trade finance, you know that the trade outlook for the region has changed significantly from what it was a generation ago. The annual total merchandise trade of Latin American countries has grown from $130 billion in 1978 to $522 billion in 1997. Even adjusted for inflation, this represents nearly a doubling of the region’s trade, allowing the growth of Latin American imports and exports to roughly match the explosion in international trade experienced throughout the world economy during this period. 
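As a rough check on the inflation adjustment behind the trade comparison above, here is a minimal sketch. The nominal trade figures are those given in the speech; the cumulative 1978-97 rise in the price level used here (about 2.4 times) is an assumed round figure for illustration, not an official deflator.

    trade_1978 = 130e9    # Latin American merchandise trade in 1978, dollars
    trade_1997 = 522e9    # Latin American merchandise trade in 1997, dollars
    price_ratio = 2.4     # assumed cumulative rise in the price level, 1978-97

    real_1997 = trade_1997 / price_ratio   # 1997 trade expressed in 1978 dollars
    multiple = real_1997 / trade_1978      # real growth factor

    print(f"1997 trade in 1978 dollars: ${real_1997 / 1e9:.0f} billion")
    print(f"real multiple of 1978 trade: {multiple:.1f}x")   # roughly 1.7x

With this assumed deflator the real rise is about 1.7 times; a somewhat smaller deflator would put it closer to the near-doubling cited in the speech. Either way, the expansion remains substantial once inflation is stripped out.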
Moreover, in many Latin American countries, exports have grown substantially as a proportion of total production, and the fraction of these exports accounted for by manufactured goods has also increased. Finally, Latin America has become an active participant in the flow of foreign direct investment and other forms of capital. In 1990, as the Debt Crisis was winding down, net direct investment flows to Latin America totaled only $7 billion; by 1997, this figure had risen to $51 billion. Portfolio capital flows also have grown very substantially. Although they have been more volatile as well, net portfolio investment rose from $18 billion in 1990 to a peak of about $61 billion in 1993 and 1994, and was $34 billion in 1997 in the wake of the Asian crisis. Obviously, the current problems indicate that participation in the global capital markets requires policy discipline and sound institutional structures, which many countries have put in place and others must continue to develop. Causes and lessons As with any macroeconomic performance, there are many causes for the general improvement that much of Latin America has experienced since the 1980s. The most general statement is that many of the economies of Latin America have adopted more prudent and open policies that foster competition and participation in global markets. The adoption of these improved policies did not come out of thin air. These policies were born out of the disastrous economic performance of the region during the 1980s, and the realization that only dramatic improvements in economic policies would suffice to allow growth and prosperity to take hold. First, there has been a general acknowledgment, and recent reinforcement of the lesson, that prudent fiscal policy is crucial to economic stability, and several of the economies of the region have made significant strides in this direction. While much is left to do in this sphere, it does appear that many countries have learned the benefits of prudent fiscal policy. Fiscal deficits in most countries in the region have dropped, and in cases where they have not yet declined, serious attention toward achieving improvements is being mustered. Improvements in budget balances have not been achieved without pain. In many countries government payrolls have been slashed, generous transfer and subsidy programs have been cut, and social expenditures have had to be reduced. On the other hand, the realization has become widespread that in the absence of such budget-cutting measures, employment and wages would become even more depressed. Moreover, people have found that many of the actions taken to cut budgets, including privatization of money-losing state enterprises, have led to better service and an improved quality of life. Second, there has been a substantial dismantling of the controls imposed by government over private sector prices and wages. It is now widely understood that excessive budget deficits and money creation are the root causes of inflation, and controls over private prices have been generally abolished. Governments also have largely scaled back their role in private wage negotiations as well. Public controls on privately contracted interest rates and other financial market prices also have been eliminated for the most part. 
Finally, exchange rates in many Latin American countries have become more flexible, and where the government still intervenes in foreign exchange markets, there is a greater understanding that such intervention must be underpinned by appropriate fiscal and monetary policies.

Third, the role of the government in other aspects of the economy has been substantially diminished, with an accordingly greater scope allowed for private activity and for competition from abroad. Internally, many countries have privatized major businesses. The privatization of Telebras, the Brazilian telephone company, is a prominent recent example, and privatization has proceeded even further in other Latin American countries. Such privatization has in many cases been complemented with legislation opening particular sectors to greater and fairer competition. Additionally, trade barriers have been removed in many countries, often through participation in regional agreements and broader international arrangements. The trade barriers that have been dismantled include both tariffs and non-tariff barriers. In consequence, the share of foreign trade in economic activity has increased substantially in certain cases. Closely associated with increased participation in international trade has been increased participation in international capital markets, made possible in part through substantial liberalization of Latin American financial markets.

Finally, in areas where government oversight is important and necessary, progress has been made in improving supervision and regulation. In particular, effort has focused on strengthening and modernizing the region’s banking sectors, which emerged from the Debt Crisis of the 1980s in a highly debilitated state. In consequence, banks have gradually risen toward a higher international standard, assisted in part by an opening of domestic financial sectors to foreign competition and participation. Measured by various criteria, including capital/asset ratios and loan loss reserves, banking systems in various Latin American nations are significantly healthier than they were going into the Debt Crisis. While the region has further to go in strengthening its financial systems, and important problems remain, the progress achieved to date has been important to the overall recovery of Latin American economies in the 1990s.

Having said this, Latin America has been affected by several major international financial crises in the last few years, and obviously Brazil is being affected currently. The lessons to be learned from these crises remain the subject of strenuous debate; I would offer only the following observations.

First, appropriate and balanced fiscal, monetary, and exchange rate policies continue to matter. The 1994–95 Mexican crisis and the current stresses on Brazil both indicate that markets watch closely those countries that allow imbalances to emerge. In fact, in an era of open capital markets and rapid capital mobility, the punishment for policy mistakes arguably is even more rapid and severe than has been the case in the past. Second, it may not be enough for countries merely to avoid excessive budget deficits or high inflation. Countries must take steps to reduce their vulnerability to dislocations in the international financial system, including by raising domestic saving rates, reducing excessive levels of short-term debt, and increasing the degree of transparency and disclosure in both the public and private sectors.
Third, a healthy banking system is an integral part of participation in the modern financial environment, and it is also a particularly important buffer against future financial shocks, both domestic and external. The question becomes how best to acquire such a system. Some countries have attempted to maintain a closed national system and build skills at home. Others have opened their banking market and allowed foreign competition to force the pace of modernization. In practice, the development of banking sectors is likely to involve some combination of local and foreign input, although increasingly, countries appear to be finding that foreign involvement provides important infusions both of expertise and of competition. As well, countries are discovering that a strengthening of the systems of banking supervision and regulation is an indispensable part of the process of modernizing the financial sector.

Open questions

Many questions about how best to manage economies in an international setting are still open. The first question is how best to enter the modern world of rapid capital mobility. Foreign capital obviously can be of tremendous benefit in helping economies to modernize and grow. At the same time, the tendency of foreign investors to pull their capital out of a country all at once, as has recently been observed in various instances around the globe, can leave an economy in serious trouble. How can the benefits of foreign capital flows be retained while their adverse side effects are minimized? Some observers have pointed to the Chilean experience with capital controls. These controls, which have now been removed, were intended to discourage inflows of short-term capital, and along with measures to strengthen the banking system, they may have helped insulate Chile’s economy from the recent round of financial disturbances. The jury is still out on whether such controls might be appropriate for other economies. It seems clear that their principal attraction is as a short-term measure that helps an economy make the transition to a fully open capital market.

A second question is whether fixed or more flexible exchange rate regimes are more appropriate in an era of rapid capital mobility. In principle, more flexible exchange rates allow economies to adjust more easily to changes in international economic and financial conditions, while fixed exchange rate systems may be useful in instilling a greater degree of discipline on the part of economic policymakers. In practice, the experience of the past year and a half has taught us that systems in which exchange rates are only halfheartedly fixed – that is, where fiscal and monetary policies are not geared toward supporting the currency – are the least sustainable and hence the least desirable. On the other hand, regimes in which economic policies are rigorously focused on maintaining an exchange rate peg may still have value in motivating prudent economic policies and in insulating economies from international financial turbulence. However, fixed exchange rate regimes and currency boards require considerable internal discipline to work as intended. The benefits are potentially large, but the effort required to maintain such a system, and the risks associated with having to abandon it under duress, are also large.

Conclusions

To summarize, much of the story of the evolution of Latin American economies in the past decade has been a story about globalization.
Globalization has operated on many different levels to improve the performance and productivity of the region. However, participation in the global financial market does, as we have seen, entail risks for countries that have not followed prudent policies. On balance, globalization clearly has been a strong positive force in the region’s economy. Latin America’s economies are sounder, more entrepreneurial, and more dynamic than they have been in many decades. And yet, grave problems remain. These include widespread poverty, tremendous disparities in wealth and income, and a level of per capita GDP that in many countries remains little higher than it was in the early 1980s. I believe that these problems are more likely to be resolved through continued participation in the global economy than by falling back on earlier models of economic development.

In closing, since I am speaking to a group of bankers, I would like to reiterate the importance of good banks and good banking skills in contributing to the economic prospects for the region. By facilitating trade finance, providing funds for growing companies, and integrating domestic financial sectors into the global capital market, banks have played a key role in furthering the development of the Latin American economies. Latin American banking systems, benefiting in large part from rising levels of foreign participation, are providing increasingly competitive levels of credit and depository services, thereby laying the groundwork for future dynamism and growth. Healthy banking systems also are crucial to promoting economic stability. The recent experience of several Asian countries, where severely weakened banking systems helped contribute to financial crises, has reminded us that good banking skills and strong bank supervision are indispensable.

Albert Hirschman’s wish for Latin American development, which is captured in the title of his book, A Bias for Hope: Essays on Development and Latin America, may eventually be fulfilled if the region stays the course and pursues sound fiscal and monetary policies and the needed improvements in underlying economic and financial institutions.

* * *
Mr Greenspan presents the Federal Reserve’s semi-annual report on monetary policy to the US Senate

Testimony of the Chairman of the Board of the US Federal Reserve System, Mr Alan Greenspan, before the Committee on Banking, Housing, and Urban Affairs of the US Senate on 23/02/99.

Chairman and members of the Committee, I appreciate the opportunity to present the Federal Reserve’s semiannual report on monetary policy.

The U.S. economy over the past year again performed admirably. Despite the challenges presented by severe economic downturns in a number of foreign countries and episodic financial turmoil abroad and at home, our real GDP grew about 4 percent for a third straight year. In 1998, 2¾ million jobs were created on net, bringing the total increase in payrolls to more than 18 million during the current economic expansion, which late last year became the longest in U.S. peacetime history. Unemployment edged down further to a 4¼ percent rate, the lowest since 1970. And despite taut labor markets, inflation also fell to its lowest rate in many decades by some broad measures, although a portion of this decline owed to decreases in oil, commodity, and other import prices that are unlikely to be repeated. Hourly labor compensation adjusted for inflation posted further impressive gains. Real compensation gains have been supported by robust advances in labor productivity, which in turn have partly reflected heavy investment in plant and equipment, often embodying innovative technologies.

Can this favorable performance be sustained? In many respects the fundamental underpinnings of the recent U.S. economic performance are strong. Flexible markets and the shift to surplus on the books of the federal government are facilitating the build-up in cutting-edge capital stock. That build-up in turn is spawning rapid advances in productivity that are helping to keep inflation well behaved. The new technologies and the optimism of consumers and investors are supporting asset prices and sustaining spending.

But, after eight years of economic expansion, the economy appears stretched in a number of dimensions, implying considerable upside and downside risks to the economic outlook. The robust increase of production has been using up our nation’s spare labor resources, suggesting that recent strong growth in spending cannot continue without a pick-up in inflation unless labor productivity growth increases significantly further. Equity prices are high enough to raise questions about whether shares are overvalued. The debt of the household and business sectors has mounted, as has the external debt of the country as a whole, reflecting the deepening current account deficit. We remain vulnerable to rapidly changing conditions overseas, which, as we saw last summer, can be transmitted to U.S. markets quickly and traumatically. I will be commenting on many of these issues as I review the developments of the past year and the prospects going forward. In light of all these risks, monetary policy must be ready to move quickly in either direction should we perceive imbalances and distortions developing that could undermine the economic expansion.

Recent Developments

A hallmark of our economic performance over the past year was the continuing sharp expansion of business investment spending. Competitive global markets and persisting technological advances both spurred the business drive to become more efficient and induced the price declines for many types of new equipment that made capital spending more attractive.
Business success in enhancing productivity and the expectation of still further, perhaps accelerated, advances buoyed public optimism about profit prospects, which contributed to another sizable boost in equity prices. Rising household wealth along with strong growth in real income, related to better pay, slower inflation, and expanding job opportunities, boosted consumption at the fastest clip in a decade and a half. The gains in income and wealth last year, along with a further decrease in mortgage rates, also prompted considerable activity in the housing sector.

The impressive performance of the private sector was reflected in a continued improvement in the federal budget. Burgeoning receipts, along with continuing restraint on federal spending, produced the first unified budget surplus in thirty years, allowing the Treasury to begin to pay down the federal debt held by the public. This shift in the federal government’s fiscal position has fostered an increase in overall national saving as a share of GDP to 17¼ percent from the 14½ percent low reached in 1993. This rise in national saving has helped to hold down real interest rates and to facilitate the financing of the boom in private investment spending.

Foreign savers have provided an additional source of funds for vigorous domestic investment. The counterpart of our high and rising current account deficit has been ever-faster increases in the net indebtedness of U.S. residents to foreigners. The rapid widening of the current account deficit has some disquieting aspects, especially when viewed in a longer-term context. Foreigners presumably will not want to raise indefinitely the share of their portfolios in claims on the United States. Should the sustainability of the build-up of our foreign indebtedness come into question, the exchange value of the dollar may well decline, imparting pressures on prices in the United States.

In the recent economic environment, however, the widening of the trade and current account deficits had some beneficial aspects. It provided a safety valve for strong U.S. domestic demand, thereby helping to restrain pressures on U.S. resources. It also cushioned, to some extent, economic weakness in our trading partners. Moreover, decreasing import prices, which partly came from the appreciation of the dollar through mid-summer, contributed to low overall U.S. inflation, as did ample manufacturing capacity in the United States and lower prices for oil and other commodities stemming from the weak activity abroad. The marked drop in energy prices significantly contributed to the subdued, less than 1 percent, increase in the price index for total personal consumption expenditures during 1998. In addition, supported by rapid accumulation of more efficient capital, the growth of labor productivity picked up last year, allowing nominal labor compensation to post another sizable gain without putting added upward pressure on costs and prices. I shall return to an analysis of the extraordinary performance of inflation later in my remarks.

The Federal Open Market Committee conducted monetary policy last year with the aim of sustaining the remarkable combination of economic expansion and low inflation. At its meetings from March to July, the inflation risks accompanying the continued strength of domestic demand and the tightening of labor markets led the FOMC to place itself on heightened inflation alert.
Although the FOMC kept the nominal federal funds rate unchanged, it allowed the real funds rate to rise with continuing declines in inflation and, presumably, inflation expectations. In August, the FOMC returned to an unbiased policy stance in response to the adverse implications for the U.S. outlook of worsening conditions in foreign economies and in global financial markets, including our own.

Shortly thereafter, a further deterioration in financial market conditions began to pose a more serious threat to economic stability. In the wake of the Russian crisis and subsequent difficulties in other emerging market economies, investors perceived that the uncertainties in financial markets had broadened appreciably, and as a consequence they became decidedly more risk averse. Safe-haven demands for U.S. Treasury securities intensified at the expense of private debt securities. As a result, quality spreads escalated dramatically, especially for lower-rated issuers. Many financial markets turned illiquid, with wider bid-asked spreads and heightened price volatility, and issuance was disrupted in some private securities markets. Even the liquidity in the market for seasoned issues of U.S. Treasury securities dried up, as investors shifted toward the more actively traded, recently issued securities and dealers pared inventories, fearing that heightened price volatility posed an unacceptable risk to their capital. Responding to losses in foreign financial markets and to pressures from counterparties, highly leveraged investors began to unwind their positions, which further weighed on market conditions. As credit became less available to business borrowers in capital markets, their demands were redirected to commercial banks, which reacted to the enlarged borrowing, and more uncertain business prospects, by tightening their standards and terms on such lending.

To cushion the domestic economy from the impact of the increasing weakness in foreign economies and the less accommodative conditions in U.S. financial markets, the FOMC, beginning in late September, undertook three policy easings. By mid-November, the FOMC had reduced the federal funds rate from 5½ percent to 4¾ percent. These actions were taken to rebalance the risks to the outlook, and, in the event, the markets have recovered appreciably. Our economy has weathered the disturbances with remarkable resilience, though some yield and bid-asked spreads still reflect a hesitancy on the part of market participants to take on risk. The Federal Reserve must continue to evaluate, among other issues, whether the full extent of the policy easings undertaken last fall to address the seizing-up of financial markets remains appropriate as those disturbances abate.

To date, domestic demand and hence employment and output have remained vigorous. Real GDP is estimated to have risen at an annual rate exceeding 5½ percent in the fourth quarter of last year. Although some slowing from this torrid pace is most likely in the first quarter, labor markets remain exceptionally tight and the economy evidently retains a great deal of underlying momentum despite the global economic problems and the still-visible remnants of the earlier financial turmoil in the United States. At the same time, no evidence of any upturn in inflation has, as yet, surfaced.

Abroad, the situation is mixed.
In some East Asian countries that, in recent years, experienced a loss of investor confidence, a severe currency depreciation, and a deep recession, early signs of stabilization and economic recovery have appeared. This is particularly the case for Korea and Thailand. Authorities in those countries, in the context of IMF stabilization programs, early on established appropriate macroeconomic policies and undertook significant structural reforms to buttress the banking system and repair the finances of the corporate sector. As investor confidence has returned, exchange rates have risen and interest rates have fallen. With persistence and follow-through on reforms, the future of those economies has promise.

The situations in some other emerging market economies are not as encouraging. The Russian government’s decision in mid-August to suspend payments on its domestic debt and devalue the ruble took markets by surprise. Investor flight exacerbated the collapse of prices in Russian financial markets and led to a sharp depreciation of the ruble. The earlier decline in output gathered momentum, and by late in the year inflation had moved up to a triple-digit annual rate. Russia’s stabilization program with the IMF has been on hold since the financial crisis hit, and the economic outlook there remains troubling.

The Russian financial crisis immediately spilled over to some other countries, hitting Latin America especially hard. Countering downward pressure on the exchange values of the affected currencies, interest rates moved sharply higher, especially in Brazil. As a consequence of the high interest rates and growing economic uncertainty, Brazil’s economic activity took a turn for the worse. Higher interest rates also had negative consequences for the fiscal outlook, as much of Brazil’s substantial domestic debt effectively carries floating interest rates. With budget reform legislation encountering various setbacks, market confidence waned further and capital outflows from Brazil continued, drawing down foreign currency reserves. Ultimately, the decision was taken to allow the real to float, and it subsequently depreciated sharply. Brazilian authorities must walk a very narrow, difficult path of restoring confidence and keeping inflation contained with monetary policy while dealing with serious fiscal imbalances.

Although the situation in Brazil remains uncertain, there has been limited contagion to other countries thus far. Apparently, the slow onset of the crisis has enabled many parties with Brazilian exposures to hedge those positions or allow them to run off. With the net exposure smaller, and increasingly held by those who both recognized the heightened risk and were willing to bear it, some of the elements that might have contributed to further contagion may have been significantly reduced.

The Economic Outlook

These recent domestic and international developments provide the backdrop for U.S. economic prospects. Our economy’s performance should remain solid this year, though likely with a slower pace of economic expansion and a slightly higher rate of overall inflation than last year. The stocks of business equipment, housing, and household durable goods have been growing rapidly to quite high levels relative to business sales or household incomes during the past few years, and some slowing in the growth of spending on these items seems a reasonable prospect.
Moreover, part of the rapid increase in spending, especially in the household sector, has resulted from the surge in wealth associated with a run-up in equity prices that is unlikely to be repeated. And the purchasing power of income and wealth has been enhanced by declines in oil and other import prices, which also are unlikely to recur this year. Assuming that aggregate demand decelerates, underlying inflation pressures, as captured by core price measures, in all likelihood will not intensify significantly in the year ahead, though the Federal Reserve will need to monitor developments carefully. We perceive stable prices as optimum for economic growth. Both inflation and deflation raise volatility and risks that thwart maximum economic growth.

Most Governors and Reserve Bank Presidents foresee that economic growth this year will slow to a 2½ to 3 percent rate. Such growth would keep the unemployment rate about unchanged. The central tendency of the Governors’ and Presidents’ predictions of CPI inflation is 2 to 2½ percent. This level represents a pick-up from last year, when energy prices were falling, but it is in the vicinity of core CPI inflation over the last couple of years.

This outlook involves several risks. The continuing downside risk posed by possible economic and financial instability around the world was highlighted earlier this year by the events in Brazil. Although financial contagion elsewhere has been limited to date, more significant knock-on effects in financial markets and in the economies of Brazil’s important trading partners, including the United States, are still possible. Moreover, the economies of several of our key industrial trading partners have shown evidence of weakness, which if it deepens could further depress demands for our exports. Another downside risk is that growth in capital spending, especially among manufacturers, could weaken appreciably if pressures on domestic profit margins mount and capacity utilization drops further. And it remains to be seen whether corporate earnings will disappoint investors, even if the slowing of economic growth is only moderate. Investors appear to have incorporated into current equity price levels both robust profit expectations and low compensation for risk. As the economy slows to a more sustainable pace as expected, profit forecasts could be pared back, which together with a greater sense of vulnerability in business prospects could damp appetites for equities. A downward correction to stock prices, and an associated increase in the cost of equity capital, could compound a slowdown in the growth of capital spending. In addition, a stock market decline would tend to restrain consumption spending through its effect on household net worth.

But on the upside, our economy has proved surprisingly robust in recent years. More rapid increases in capital spending, productivity, real wages, and asset prices have combined to boost economic growth far more and far longer than many of us would have anticipated. This “virtuous cycle” has been able to persist because the behavior of inflation also has been surprisingly favorable, remaining well contained at levels of utilization of labor that in the past would have produced accelerating prices. That it has not done so in recent years has been the result of a combination of special one-time factors holding down prices and more lasting changes in the processes determining inflation.
Among the temporary factors, the sizable declines in the prices of oil, other internationally traded commodities, and other imports contributed directly to holding down inflation last year, and also indirectly by reducing inflation expectations. But these prices are not likely to fall further, and they could begin to rise as some Asian economies revive and the effects of the net depreciation of the dollar since mid-summer are felt more strongly. At the same time, however, recent experience does seem to suggest that the economy has become less inflation prone than in the past, so that the chances of an inflationary breakout arguably are, at least for now, less than they would have been under similar conditions in earlier cycles.

Several years ago I suggested that worker insecurity might be an important reason for unusually damped inflation. From the early 1990s through 1996, survey results indicated that workers were becoming much more concerned about being laid off. Workers’ underlying fear of technology-driven job obsolescence, and hence willingness to stress job security over wage increases, appeared to have suppressed labor cost pressures despite a reduced unemployment rate. More recently, that effect seems to have diminished in part. So while job loss fears probably contributed to wage and price suppression through 1996, it does not appear that a further heightening of worker insecurity about employment prospects can explain the more recent improved behavior of inflation.

Instead, a variety of evidence, anecdotal and otherwise, suggests that the source of recent restrained inflation may be emanating more from employers than from employees. In the current economic setting, businesses sense that they have lost pricing power and generally have been unwilling to raise wages any faster than they can support at current price levels. Firms have evidently concluded that if they try to increase their prices, their competitors will not follow, and they will lose market share and profits. Given the loss of pricing power, it is not surprising that individual employers resist pay increases.

But why has pricing power of late been so delimited? Monetary policy certainly has played a role in constraining the rise in the general level of prices and damping inflation expectations over the 1980s and 1990s. But our current discretionary monetary policy has difficulty anchoring the price level over time in the same way that the gold standard did in the last century.

Enhanced opportunities for productive capital investment to hold down costs also may have helped to damp inflation. Through the 1970s and 1980s, firms apparently found it easier and more profitable to seek relief from rising nominal labor costs through price increases than through cost-reducing capital investments. Price relief evidently has not been available in recent years. But relief from cost pressures has. The newer technologies have made capital investment distinctly more profitable, enabling firms to substitute capital for labor far more productively than they could have a decade or two ago. Starting in 1993, capital investment, especially in high-tech equipment, rose sharply beyond normal cyclical experience, apparently the result of expected increases in rates of return on the new investment. Had the profit expectations not been realized, one would have anticipated outlays to fall back. Instead, their growth accelerated through the remainder of the decade. More direct evidence confirms improved underlying profitability.
According to rough estimates, the productivity of labor and capital has risen significantly during the past five years. It seems likely that the synergies of advances in laser, fiber optic, satellite, and computer technologies with older technologies have enlarged the pool of opportunities to achieve a rate of return above the cost of capital. Moreover, the newer technologies have facilitated a dramatic foreshortening of the lead times on the delivery of capital equipment over the past decade, presumably allowing businesses to react more expeditiously to an actual or expected rise in nominal compensation costs than, say, they could have in the 1980s. In addition, the surge in investment not only has restrained costs but also has increased industrial capacity faster than factory output has risen. The resulting slack in product markets has put greater competitive pressure on businesses to hold down prices, despite taut labor markets.

The role of technology in damping inflation is manifest not only in its effects on U.S. productivity and costs, but also through international trade, where technological developments have progressively broken down barriers to cross-border trade. The enhanced competition in tradable goods has enabled excess capacity previously bottled up in one country to augment worldwide supply and exert restraint on prices in all countries’ markets. The resulting price discipline also has constrained nominal wage gains in internationally tradable goods industries. As workers have attempted to shift to other sectors, gains in nominal wages and increases in prices in nontradable goods industries have been held down as well.

The process of price containment has potentially become, to some extent, self-reinforcing. Lower inflation in recent years has altered expectations. Workers no longer believe that escalating gains in nominal wages are needed to reap respectable increases in real wages, and their remaining sense of job insecurity is reinforcing this. Since neither firms nor their competitors can count any longer on a general inflationary tendency to validate decisions to raise their own prices, each company feels compelled to concentrate on efforts to hold down costs. The availability of new technology to each company and its rivals affords both the opportunity and the competitive necessity of taking steps to boost productivity.

It is difficult to judge whether these significant shifts in the market environment in which firms function are sufficient to account for our benign overall price behavior during the past half decade. Undoubtedly, other factors have been at work as well, including the temporary factors I mentioned earlier and some more lasting ones I have not discussed, such as worldwide deregulation and privatization, and the freeing-up of resources previously employed to produce military products that was brought about by the end of the cold war. There also may be other contributory forces lurking unseen in the wings that will become clear only in time. Over the longer run, of course, the actions of the central bank determine the degree of overall liquidity and hence the rate of inflation. It is up to us to validate the favorable inflation developments of recent years.

Although the pace of productivity increase has picked up in recent years, the extraordinary strength of demand has meant that the substitution of capital for labor has not prevented us from rapidly depleting the pool of available workers.
This worker depletion constitutes a critical upside risk to the inflation outlook because it presumably cannot continue for very much longer without putting increasing pressure on labor markets and on costs. The number of people willing to work can be usefully defined as the unemployed component of the labor force plus those not actively seeking work, and thus not counted in the labor force, but who nonetheless say they would like a job if they could get one. This pool of potential workers aged 16 to 64 currently numbers about 10 million, or just 5¾ percent of that group’s population – the lowest such percentage on record, which begins in 1970, and 2½ percentage points below its average over that period. The rapid increase in aggregate demand has generated growth of employment in excess of growth in population, causing the number of potential workers to fall since the mid-1990s at a rate of a bit under 1 million annually. We cannot judge with precision how much further this pool can shrink without sparking ever greater upward pressures on wages and prices. But, should labor market conditions continue to tighten, there has to be some point at which the rise in nominal wages will start increasingly outpacing the gains in labor productivity, and prices inevitably will begin to accelerate.

Ranges for Money and Credit

At its February meeting, the Committee elected to ratify the provisional ranges for all three aggregates that it had established last July. Specifically, the Committee again has set growth rate ranges over the four quarters of 1999 of 1 to 5 percent for M2, 2 to 6 percent for M3, and 3 to 7 percent for domestic nonfinancial debt. As in previous years, the Committee interpreted the ranges for the broader monetary aggregates as benchmarks for what money growth would be under conditions of price stability and sustainable economic growth, assuming historically typical velocity behavior.

Last year, these monetary aggregates far overshot the upper bounds of their annual ranges. While nominal GDP growth did exceed the rate likely consistent with sustained price stability, the rapid growth of M2 and M3 also reflected outsized declines in their velocities, that is, the ratio of nominal GDP to money. M2 velocity dropped by about 3 percent, while M3 velocity plunged by 5¼ percent. Part of these velocity declines reflected some reduction in the opportunity cost of holding money; interest rates on Treasury securities, which represent an alternative return on non-monetary assets, dropped more than did the average of interest rates on deposits and money market mutual funds in M2, drawing funds into the aggregate. Even so, much of last year’s aberrant behavior of broad money velocity cannot readily be explained by conventional determinants. Although growth of the broad aggregates was strong earlier in the year, it accelerated in the fourth quarter after credit markets became turbulent. Perhaps robust money growth late in the year partly reflected a reaction to this turmoil by the public, who began scrambling for safer and more liquid financial assets. Monetary expansion has moderated so far this year, evidently in lagged response to the calming of financial markets in the autumn. Layered on top of these influences, though, the public also may have been reapportioning their savings flows into money balances because the huge run-up in stock prices in recent years has left an uncomfortably large portion of their net worth in equities.
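To make the velocity arithmetic concrete: velocity is defined as the ratio of nominal GDP to the money stock,

$$ V \equiv \frac{PY}{M}, \qquad \text{so that} \qquad \%\Delta V \;\approx\; \%\Delta(PY) \;-\; \%\Delta M. $$

On this identity, the reported 3 percent drop in M2 velocity means that M2 grew roughly 3 percentage points faster than nominal GDP. If nominal GDP grew on the order of 5½ percent over 1998 – an illustrative figure, not one cited in the testimony – M2 growth would have been around 8½ percent, well above the 5 percent upper bound of its annual range, consistent with the overshoot described above.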
For the coming year, the broad monetary aggregates could again run high relative to these ranges. To be sure, the decline in the velocities of the broader aggregates this year should abate to some extent, as money demand behavior returns more to normal, and growth in nominal GDP should slow as well, as suggested by the Governors’ and Presidents’ central tendency. Both factors would restrain broad money expansion relative to last year. Still, the growth of M2 and M3 could well remain outside their price-stability ranges this year. Obviously, considerable uncertainty continues to surround the prospective behavior of monetary velocities and growth rates.

Domestic nonfinancial debt seems more likely than the monetary aggregates to grow within its range for this year. Indeed, domestic nonfinancial debt also could grow more slowly this year than last year’s 6¼ percent pace, which was in the upper part of its 3 to 7 percent annual range. With the federal budget surplus poised to widen further this year, federal debt should contract even more quickly than last year. And debt in each of the major nonfederal sectors in all likelihood will decelerate as well from last year’s relatively elevated rates, along with the projected slowing of nominal GDP growth.

The FOMC’s Disclosure Policy

The FOMC at recent meetings has discussed not only the stance of policy, but also when and how it communicates its views of the evolving economic situation to the public. The FOMC’s objective is to release as much information about monetary policy decision making, and as promptly, as is consistent with maintaining an effective deliberative process and avoiding roiling markets unnecessarily. Since early 1994, each change in the target nominal federal funds rate has been announced immediately with a brief rationale for the action. The FOMC resolved at its December meeting to take advantage of an available but previously unused policy, originally stated in early 1994, of releasing, on an infrequent basis, a statement immediately after some FOMC meetings at which the stance of monetary policy has not been changed. The Federal Reserve will release such a statement when it wishes to communicate to the public a major shift in its views about the balance of risks or the likely direction of future policy. Such an announcement need not be made after every change in the tilt of the directive. Instead, this option would be reserved for situations in which the consensus of the Committee clearly had shifted significantly, though not by enough to change current policy, and in which the absence of an explanation risked misleading markets about the prospects for monetary policy.

Year 2000 Issues

Before closing, I’d like to address an issue that has been receiving increasing attention – the century date change. While no one can say that the rollover to the year 2000 will be trouble free, I am impressed by the efforts to date to address the problem in the banking and financial system. For our part, the Federal Reserve System has now completed remediation and testing of 101 of its 103 mission-critical applications, with the remaining two to be replaced by the end of March. We opened a test facility in June at which, to date, more than 6,000 depository institutions have conducted tests of their Y2K compliant systems, and we are well along in our risk mitigation and contingency planning activities.
As a precautionary measure, the Federal Reserve has acted to increase the currency in inventory by about one-third to approximately $200 billion in late 1999 and has other contingency arrangements available if needed. While we do not expect currency demand to increase dramatically, the Federal Reserve believes it is important for the public to have confidence in the availability of cash in advance of the rollover. As a result of these kinds of activities, I can say with assurance that the Federal Reserve will be ready in both its operations and planning activities for the millennium rollover.

The banking industry is also working hard, and with evident success, to prepare for the event. By the end of the first quarter, every institution in the industry will have been subject to two rounds of onsite Y2K examinations. The Federal Reserve, like the other regulators, has found that only a small minority of institutions have fallen behind in their preparations, and those institutions have been targeted for additional follow-up and, as necessary, formal enforcement actions. The overwhelming majority of the industry has made impressive progress in its remediation, testing, and contingency planning efforts.

Concluding Comment

Americans can justifiably feel proud of their recent economic achievements. Competitive markets, with open trade both domestically and internationally, have kept our production efficient and on the expanding frontier of technological innovation. The determination of Americans to improve their skills and knowledge has allowed workers to be even more productive, elevating their real earnings. Macroeconomic policies have provided a favorable setting for the public to take greatest advantage of opportunities to improve their economic well being. The restrained fiscal policy of the Administration and the Congress has engendered the welcome advent of a unified budget surplus, freeing up funds for capital investment. A continuation of responsible fiscal and, we trust, monetary policies should afford Americans the opportunity to make considerable further economic progress over time.
Mr Ferguson discusses the evolution of financial institutions and markets: implications for public and private policies and practices

Remarks by Mr Roger W Ferguson Jr, a member of the Board of Governors of the US Federal Reserve System, at the Money Marketeers of New York University, New York, on 25/2/99.

This evening I would like to spend a few minutes discussing some of the implications of the rapid and ongoing evolution of our financial institutions and markets. New financial instruments, innovations in portfolio management, and the technological capability of implementing new risk management strategies offer opportunities to reduce risk and to improve efficiency by allocating risk to those most willing to accept it. Reductions of trade barriers and the freer flow of financial capital around the world mean better resource allocation, improved productivity, and higher standards of living for citizens of the United States and many other nations.

While the current and potential benefits of financial and technological change are real and substantial, they do not come without some costs. For example, the rapid pace of technological change and the wide array of innovative financing techniques have in some ways made it more difficult for outside investors and policymakers to evaluate the risks borne by individual institutions and the broader markets. And nearly instantaneous communications and heightened interdependencies among nations speed the effects of poor investment and policy decisions around the globe. Recent experiences in Asia, Russia, Latin America, and at home have taught us a lot about the risks of an increasingly interdependent world linked by complex financial relationships.

In this rapidly evolving world of inevitable benefits and costs, the key question for both financial policymakers and market participants is: How can we retain the benefits of rapid technological and financial innovation and of freer movements of goods and financial capital across national borders, while simultaneously protecting our financial institutions and markets from the risks that these changes might bring? This is not a new question, and I am sure that most of this audience is well aware that many efforts are under way in both the public and private sectors to address the variety of issues that this question evokes. As always, there are no simple answers. But I believe that a number of fundamental principles have emerged that should be used to help shape both public and private policies and practices.

Private Market Discipline Is the First Line of Defense

Perhaps the most fundamental principle that must guide us is that private market participants are the first line of defense against excessive private and public risk in the financial system. Private borrowers, lenders, investors, institutions, traders, brokers, exchanges, and clearing systems all have huge stakes in containing their risks as individual agents and risks to the system as a whole. Private market participants can discourage excessive risk taking by choosing to do business with those firms that demonstrate sound risk management systems and portfolios that balance appropriately risk and expected return. If private markets are going to perform their risk control functions, then market prices must send the right signals to all participants about the risks and rewards of a particular transaction or at a given firm.
In order to improve the ability of market prices to accurately reflect risks and returns, I believe that we should take action ourselves and encourage action by others in at least three areas.

First, we should seek ways of improving the transparency of financial institutions and markets. As we all know, full information is a fundamental requirement of free and competitive markets. More particularly, financial institutions and individual investors must be well informed about their own and their counterparties’ exposures, the nature of new financial instruments, and the extent of overall market liquidity. I believe that banks and other financial institutions could significantly improve their disclosures by providing more information to the market about their risk management policies and practices and about the values of internal risk measures. At present, the market seems to grant great weight to bank regulatory capital ratios that are only crude indicators of an institution’s risk profile. That attention is driven in large part by the fact that these regulatory measures provide a consistent basis for comparison. The regulatory and financial communities should work together to identify more meaningful statistics to meet the market’s needs.

At the international level much work is being done, and much remains. One of the key lessons of our most recent financial crises is that international accounting and public disclosure standards are often inadequate. An important step forward was the publication last November by the international Basle Committee on Banking Supervision of guidelines for enhancing bank transparency. That report provides guidance to banks and banking supervisors around the world on the nature of core disclosures that should be provided to the public. Much more, however, should be done to provide the public and supervisors with the information they need to exert effective market discipline.

A second area where we could improve market discipline is in affecting how market participants view what has come to be known as the too-big-to-fail problem. In this regard, I would emphasize that the FDIC Improvement Act of 1991, or FDICIA, contains rather tough language about too-big-to-fail – language that I assure you the Board takes very seriously. Perhaps it would be useful to review briefly what the law says.

FDICIA requires that, regardless of the size of a bank, the bank resolution method chosen by the FDIC be the least costly of all possible methods for meeting the FDIC's obligation to protect insured deposits. In addition, FDICIA prohibits the FDIC from assisting uninsured depositors and creditors, or shareholders of an insured depository institution. Add to these FDICIA provisions the depositor preference provisions in the Omnibus Budget Reconciliation Act of 1993, and you have some rather potent reasons for the market to be disciplined in its dealings with insured depositories. The only exception to these obligations is the so-called “systemic risk exception.” But the systemic risk exception is quite tough and explicit.
It requires concurrence by two-thirds of the Federal Reserve Board, two-thirds of the FDIC Board, and the Secretary of the Treasury in consultation with the President that conformance with least-cost resolution would “have serious adverse effects on economic conditions or financial activity” before the FDIC is allowed to “take other action or provide assistance as necessary to avoid or mitigate such effects.” In addition, if the systemic risk exception is used, any insurance fund losses arising from such exceptional actions must be recovered through special assessments on all depository institutions that are members of the relevant federal deposit insurance fund. Lastly, the Comptroller General must subsequently review the agencies' actions and report its findings to Congress. The sum total of these conditions establishes, in my view, a rather high hurdle that must be cleared before the systemic risk exception can be used.

FDICIA and other reforms have, I believe, altered market perceptions of too-big-to-fail. Nonetheless, the obvious need for the central bank and other government agencies to prevent a systemic collapse of the banking and financial system, the creation of seemingly ever-larger financial organizations, and the inherent uncertainties involved in the management of any crisis leave room for doubt in some observers’ minds. Perhaps by its very nature this is an issue that can never be fully resolved. But it seems clear to me that any institution, regardless of size, can fail in the sense that existing stockholders can lose their total investment, existing management can be replaced, and uninsured creditors can suffer losses. In some cases it may be necessary for an institution to stay in operation and be wound down in an orderly way over a transition period. Ultimately, the institution could be sold in whole or in part. But even in such cases the expectation of owners, managers, and uninsured creditors should be that real and significant losses will be incurred. In my judgment, if policies consistent with these principles are followed, then we will have eliminated much of the moral hazard associated with the federal safety net for depository institutions while simultaneously being able to achieve our goal of preserving financial stability.

One way to enhance the ability of market participants to limit risk taking by banks might be to require at least the largest and most complex banks to issue a minimum amount of subordinated debt. Many such proposals have surfaced over the last decade, including some from within the Federal Reserve System. And while I think that it is premature to endorse any one proposal – indeed, there are a number of thorny details that would need to be worked out – I am strongly attracted to the basic concept advanced by proponents of subordinated debt.

The fundamental notion behind requiring at least some banks to issue traded subordinated debt is to create a set of uninsured bank creditors whose risk preferences are similar to those of bank supervisors. Because subordinated debt holders bear downside risk but enjoy virtually no upside potential, they tend to be risk averse in much the same way as is the FDIC. Thus, when a bank seeks to issue subordinated debt, the price that investors are willing to pay brings direct market discipline aimed at controlling excessive risk taking by the bank.
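The payoff asymmetry behind this argument can be sketched in a few lines of code; the balance-sheet figures below are hypothetical and purely illustrative, not drawn from the remarks.

```python
# Stylized payoff to a subordinated debt holder at resolution of a bank.
# All numbers are hypothetical; this illustrates the asymmetry described
# above, not any actual instrument.
def sub_debt_payoff(asset_value, senior_claims, face_value):
    """Residual after senior claims (deposits, senior debt), capped at face value."""
    residual = max(0.0, asset_value - senior_claims)
    return min(face_value, residual)

senior, face = 90.0, 10.0
for assets in (80.0, 95.0, 100.0, 120.0, 150.0):
    # Payoff never exceeds face value however well the bank does, but falls
    # to zero once senior claims exhaust the assets.
    print(f"assets={assets:6.1f} -> payoff={sub_debt_payoff(assets, senior, face):5.1f}")
```

Because the holder’s best case is capped at face value while the worst case is a total loss, the holder, like the deposit insurer, cares mainly about downside risk.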
A second key objective is to create a set of market instruments that would provide clear, and frequent, signals of the market’s evaluation of a bank’s financial condition. Such signals could act as a deterrent to a bank’s tendency to take excessive risk, and could perhaps alert bank supervisors to examine, or otherwise intervene in, a bank more quickly than they otherwise would. Changes in the market prices of traded bank subordinated debt, and perhaps other actions by the owners of this debt, have the potential to provide such signals. In this way subordinated debt could be used to bring indirect market discipline on a bank.

Supervisory Discipline Must Be An Effective and Dynamic Second Line of Defense

While market discipline must be our first line of defense for ensuring financial stability, bank supervision also has an important role to play. The very nature of a systemic disruption, which imposes costs not only on the perpetrators but also on many and diverse economic agents far removed from the immediate event, means that market participants find it impossible to fully incorporate systemic risks into market prices. Indeed, it is this very aspect of systemic risk that justifies the existence of a government safety net for depository institutions. The inevitable moral hazard of the safety net requires that bank supervisors have the ability to exert supervisory discipline on the riskiness of banks. I would like to comment tonight on what I consider two of the most pressing needs in the bank supervisory area: the need to make capital standards more risk sensitive, and the need to focus supervisory practice more on risk measurement and management.

I need not explain to this audience why the maintenance of strong capital positions is critical to the preservation of a safe and sound banking system. Indeed, ensuring strong capital has been a cornerstone of bank supervision for decades. However, the development by some of our largest and most complex banks of increasingly sophisticated models for measuring, managing, and pricing risk has called into question the continuing usefulness of the current capital standards – the so-called risk-based capital standards – that are part of the Basle Accord. The Basle Accord capital standards were adopted in 1988 by most of the world's industrialized nations in an effort to encourage stronger capital at riskier banks, to include off-balance-sheet exposures in the assessment of capital adequacy, and to establish more consistent capital standards across nations. The Accord was a major advance in 1988, and initially proved to be very useful in promoting safety and soundness goals. But in recent years calls for reform have begun to grow.

I will outline briefly one of the key problems we are currently facing with the Basle Accord. The Accord divides the on- and off-balance-sheet assets of banks into four risk buckets, and then applies a different capital weight to each bucket. The basic idea is that more capital should be required to be held against riskier assets. However, the relationship is rough. Perhaps most troublesome, the same risk weight is applied to all loans. Thus, for example, a loan to a very risky “junk bond” company gets the same weight as a loan to a “triple A” rated firm. While the Accord has the virtue of being relatively easy to administer, it also clearly gives banks an incentive to sell or not to originate loans that their internal models say need less capital than is required by the Basle Accord. Conversely, banks are encouraged to book loans that their models say require more capital than does the Basle standard.
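A minimal sketch of the bucket arithmetic follows. The four risk weights and the 8 percent minimum ratio are the commonly cited 1988 Accord parameters; the portfolio figures are hypothetical.

```python
# Minimal sketch of 1988 Basle Accord risk-weighted capital arithmetic.
# Weights and the 8% minimum are the standard Accord parameters as
# generally described; the portfolio below is hypothetical.
RISK_WEIGHTS = {
    "cash_and_oecd_government": 0.00,
    "oecd_bank_claims":         0.20,
    "residential_mortgages":    0.50,
    "all_other_loans":          1.00,  # junk-rated and AAA borrowers alike
}
MIN_CAPITAL_RATIO = 0.08

portfolio = {  # hypothetical exposures, in millions
    "cash_and_oecd_government": 200,
    "oecd_bank_claims":         100,
    "residential_mortgages":    300,
    "all_other_loans":          400,
}

rwa = sum(RISK_WEIGHTS[k] * v for k, v in portfolio.items())
required_capital = MIN_CAPITAL_RATIO * rwa
print(f"Risk-weighted assets: {rwa:.0f}; required capital: {required_capital:.1f}")
# Because a AAA loan and a junk-rated loan both sit in the 100% bucket,
# regulatory capital can diverge from a bank's own estimate of economic
# capital in either direction, which is the arbitrage incentive noted above.
```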
Not surprisingly, some banks have devoted substantial resources to attempting to achieve both adjustments to their portfolios. The resulting “regulatory arbitrage” surely causes some serious problems. For one thing, it makes reported capital ratios – a key measure of bank soundness used by supervisors and investors – less meaningful for government supervisors and private analysts.

Efforts are currently under way to redress many of the deficiencies in the current Basle Accord. But many of the issues are complex, and the optimal changes are still unclear. A consensus does seem to have developed that the Accord must be more risk sensitive. But how risk sensitive, and how should that risk sensitivity best be implemented? I foresee a multistaged process, with perhaps some modest and relatively noncontroversial “fixes” being proposed, possibly in the very near future, and more fundamental reforms being developed and implemented over a period of several years. Given the dynamic nature of change in the financial sector, such a phased, or evolutionary, approach to revising the Accord is probably not just the only practical strategy, but the most prudent one as well.

The case for an evolutionary approach applies with perhaps even more force to other supervisory policies and procedures. For example, the increasing use and sophistication of credit risk models at the largest and most complex domestic and foreign banks has profound implications for supervisory activities as well as capital regulation. Understanding and evaluating credit risk models and related risk measurement techniques are quickly becoming required skills of bank supervisors. This need is fueled by the ever-growing array of securitizations, credit derivatives, remote originations, financial guarantees, and a seemingly endless stream of other financial innovations. Add to this the fantastic speed with which financial transactions can now be conducted, and one begins to get a feel for the many challenges facing bank supervisors.

With these realities in mind, supervisory practices at all of the banking agencies are changing. Oversight of banks has become much more continuous and risk-focused than even a few years ago, especially at the largest and most complex organizations. It is recognized that we can no longer rely on periodic on-site examinations to ensure that these large institutions are operating in a safe and sound manner, but rather must be assured that their risk management practices and internal controls are sound on an ongoing basis. Still, on-site examinations, in my judgment, remain critical. However, on-site examinations must evolve to be both as effective and as unobtrusive as possible. We are devoting substantial efforts to attracting, training, retaining, and using effectively the highly skilled personnel that modern bank examinations require.

The new procedures place greater importance on an institution’s internal management processes, beginning at the top. Consistent with the view that private agents are the first line of defense against excessive risk, boards of directors are expected to be actively involved in establishing the overall environment for taking risk, staying informed about the level of risks and how they are managed and controlled, and making sure that senior management is capable and has the resources it needs to be successful.
Management is expected to develop and implement the policies and procedures needed to conduct a banking business in a safe and sound manner. Internal controls, evaluated by an independent internal auditing function, must be sound. Conclusion I hope that my brief review of some of the key challenges facing economic policymakers and private participants in banking and financial markets has convinced you, if indeed you needed convincing, that these aspects of the modern world of finance are important, exciting, and deserve the serious attention of all participants. The rewards, current and potential, of modern banking and finance are great. But there are also some very real risks that we need to address in effective, cooperative, and inevitably evolving ways.
Mr Ferguson reviews the historical perspective and policy implications of global financial integration Remarks by Mr Roger W Ferguson Jr, a member of the Board of Governors of the US Federal Reserve System, at the Annual Policy Seminar of the National Association for Business Economics, Washington DC, on 4/3/99. Global Financial Integration: Historical Perspective and Policy Implications Thank you very much for inviting me to speak to this distinguished group of business economists and analysts. As we have discovered over the last eighteen months or so, we live in a rapidly changing global economic and financial environment. Today, given the theme of your conference, I would like to put the events of the last few years into a broader historical context, and ask what the most appropriate policy solutions are. Yesterday and Today Let me turn to a previous period of close global linkage in economic and financial markets: the forty years or so prior to World War I. Here I will draw on a recent NBER working paper by Michael Bordo, Barry Eichengreen and Jongwoo Kim (“Was There Really an Earlier Period of International Financial Integration Comparable to Today?”, National Bureau of Economic Research Working Paper 6738, September 1998). During this time, capital flowed essentially without restriction, chiefly from developed Western Europe to regions in the then rapidly developing countries of the Americas and to Australia. Just to put this in perspective, these three economists report that at its peak, the outflow from the United Kingdom reached 9 percent of GNP and was almost as high in France, Germany and the Netherlands. Much of this capital was invested in bonds that financed railroads and other infrastructure investments and in long-term government debt instruments, but there was also significant direct foreign investment. Bordo, Eichengreen and Kim find that one element of the pre-1914 episode of global financial integration is the size and persistence of the current account deficits of capital importing countries, mainly Australia, Canada, Argentina and the Scandinavian countries. Similarly, the current account surpluses of some capital exporters, particularly the United Kingdom, showed considerable persistence. These economic historians indicate that it was not unusual for these countries to experience current account imbalances, deficits for the capital importers or surpluses for the capital exporters, of between 5 percent and 10 percent of GDP lasting for several years. The size and persistence of these capital account imbalances in the pre-1914 period appear to compare roughly with the size of current account imbalances that have existed in the most recent period of global financial integration. Economic historians indicate that during the period from 1971 to 1997, emerging market countries were running, on average, current account deficits of about 4 percent of their GDPs. However, according to these historians, capital flows to the emerging market countries of today have been somewhat more variable – as measured by the standard deviation of the current account to GDP ratio – than those provided to the “emerging market economies” of the late 1800s and early 1900s. While the pre-1914 period can be characterized as more stable in some respects, economists have identified other important differences between that period of global capital flows and our own. First, gross flows are far greater today. A second difference is in the composition of the investment. While data are not complete, it does appear that the bulk – possibly as much as 85 percent, according to estimates reported by Bordo, Eichengreen and Kim – of pre-1914 investments were in government and municipal securities, railways, resource-extraction companies and public utilities. Today, it is safe to estimate that a larger proportion of investments are in commercial, industrial and financial concerns, whose assets are more intangible and whose operations are less transparent. Third, prior to 1914, the vast majority of portfolio capital flows took the form of bonds. More recently, portfolio flows have been more evenly split between bonds and stocks. Lessons for Policymakers Policymakers need to understand this earlier stage of global capital flows and should attempt to adapt any lessons to today’s world of absolutely large, very liquid, and fundamentally less transparent cross-border capital flows. The core question, obviously, is what factors allowed the apparent persistence and relative stability of the pre-1914 era of capital flows. One theory advanced by academics is that capital exports and investments went largely to countries linked through shared language, culture, legal systems and accounting regimes. Another hypothesis may be that, in the period leading up to World War I, the commitment of policymakers to stable monetary and fiscal policies was highly credible to market participants, perhaps reflecting a shared belief in and commitment to the gold standard. The policymakers’ challenge today is to build institutional structures, policy credibility, and private sector discipline that are at least as strong as might have existed earlier. However, we must do so against a background of modern, technologically driven, global finance. It is clear that we must make the adjustments required because, with the absolute scale of global financial markets increasing, the size of bailout packages needed to fill financing gaps during international crises will get progressively larger and become less feasible to assemble. In addition, policymakers must always be concerned about the moral hazard of appearing to stand ready to help private sector debtors and creditors escape from problems of their own making. Finally, I believe that we must achieve these goals without attempting to step back to policy regimes of prior periods that are clearly unworkable now or that undermine the advantages that we know come with a free flow of capital. So what are the policy solutions that meet these criteria? In overview, I would say that there probably are very few grandiose new solutions that need to be implemented as a policy response to today’s global financial turmoil. Rather, I see an emphasis on basic, sound practices. The Need for Sound Institutional Underpinnings One lesson suggested by the pre-1914 period of relative stability in capital flows is the importance of broadly understood and accepted institutional arrangements. Countries that wish to participate in the modern global financial market need to modernize their institutional underpinnings. Several actions come to mind. First, the basic underpinnings of private property protection need to be in place. The legal system needs to define and support the enforceability of contracts.
For example, while international bankruptcy procedures do not seem feasible, the adoption or improvement of national bankruptcy systems is essential and should have a high priority. Debtors and creditors alike need to have an understanding of the rules under which private sector defaults are to be resolved and confidence that these rules will be consistently applied. I put the highest priority on bankruptcy procedures because they provide the self-adjusting mechanisms that allow the private sector to continue to function even in a world in which financial crises are likely to occur. Various plans to improve international financial rules will emerge, but they must all, in my judgment, include a focus on national bankruptcy codes. Similarly, accounting standards and private sector disclosure are important areas for improvement. Accounting standards provide the foundation for credible financial statements and other disclosures that are key means for communicating a company’s operating results and its overall health, as well as for making more transparent various operating activities. Disclosure of reliable information facilitates market discipline, strengthens confidence, improves decision making and reduces the chance that rumors and misleading information could cause instability in national and global markets. Also, by facilitating market discipline, public disclosure helps reinforce supervisory efforts to encourage banks and other companies to maintain sound risk management practices and internal controls. For these reasons, the Federal Reserve and the Basle Committee on Banking Supervision have strongly supported initiatives to improve the quality of national and international accounting standards and disclosure practices. Finally, bank supervision and regulation must be upgraded. Banks are an important mechanism for transmitting funds both domestically and internationally. However, in societies in which banks are afforded certain benefits and safety nets because banking problems have the potential to disrupt financial markets, banks can perform their intermediary role best if they are properly supervised. This supervision, in turn, should start with oversight by depositors and participants in private debt and equity markets and be reinforced by appropriate supervisory practices and safeguards. Assessing the soundness of banking institutions is difficult, simply because of the nature of the banking business. Extending credit inherently involves subjective judgments, and standards of adequate risk measurement and disclosure must continually evolve along with the nature and complexity of banking products and the ability of institutions to assimilate information. Both authorities and market participants must request and receive sufficiently accurate, timely, and relevant information on which to base their decisions. They must also have sufficient insight and expertise to evaluate it properly. No discussion of institutional arrangements in the global economic and financial system will be complete without a discussion of the IMF, the international financial institution central to resolving balance of payments crises. If the IMF did not exist we would probably need to create something like it. In a world of global finance, it is useful to have a multilateral organization that has some responsibility for providing liquidity assistance to countries when they do not have access to capital markets and also encouraging countries to undertake prudent fiscal and monetary policies. 
The IMF has come under some criticism for elements of its policy prescriptions. I am sure that economists and policymakers will have opportunities to review and evaluate the performance of the IMF during this period. It might be appropriate to define more clearly the functional roles played by the IMF and other international financial institutions in order to improve the efficient use of scarce international resources. At the present time, however, the IMF is a critical participant in maintaining international financial stability. The Need for Credible Economic and Financial Policies A second lesson of history and recent experience is that policymakers must follow sound and credible macroeconomic, microeconomic and financial policies. In particular, a necessary condition is fiscal and monetary policies that are, and are perceived to be, responsible. As we have seen in some of the crises that have hit global financial markets, particularly those that have a public sector origin, heavy short-term borrowing by governments, when coupled with fiscal imbalances, can be a recipe for financial market crisis. Credibility comes from monetary policies aimed at achieving price stability, avoiding either general increases or decreases in price levels, and from fiscal policies that avoid imbalances and unsustainable public sector debts or deficits. In short, in the absence of an external control on fiscal and monetary policy, policymakers and politicians must act with a great deal of discipline. However, we also know, particularly from Asia, that fiscal and monetary rectitude is not sufficient. Many of those countries had strong fiscal positions and sound monetary policy, but still suffered from international financial uncertainties. Microeconomic and financial policies must also be credible. The policies that are most responsible are those that allow free markets to work without undue intervention from governmental authorities. We have learned that governmental influence in private sector borrowing and investing decisions is to be avoided. This is all the more true because, as the research has shown, a larger portion of cross-border investment in the modern era appears to be in sectors in which financial performance and the valuation of assets are harder to gauge. In those circumstances any hint of governmental support is certainly to be avoided. Recently some have proposed the creation of “target zones” for major currencies, presumably in order to create, in part, greater stability in global financial markets. I think that these proposals are unlikely to work. We have seen on several occasions that the scale of resources required for the authorities to move exchange rates in directions not supported by the fundamentals, or by market perceptions of those fundamentals, can be enormous, while the resulting exchange rate movements are often relatively small and transitory. If monetary policy is used to maintain the exchange rate within a target range, authorities at times may have to choose between maintaining domestic price stability and a currency target. Most monetary policy authorities recognize the value of maintaining price stability as one of the requirements for creating sound macroeconomic conditions for sustainable domestic growth. For the major countries, the benefits of an explicit focus on exchange rate stability, rather than price stability and sustainable growth, are not clear.
I believe that policymakers should support greater transparency in the reporting of their own operations, as well as in the private sector activities that I discussed above. Better and more timely information promotes better allocation of resources, including the global allocation of capital. The financial crises of the last several years indicate that greater transparency is needed on levels of external debt, including short-term foreign currency debt of the central government, and on levels of international reserves, including financial derivative positions. For transparency standards to be most effective, they require as broad a participation by authorities as possible. Further, they must balance the need for comprehensive and timely disclosure with the need for accurate data. Improvements in financial data from the private sector, which I have discussed, are crucial, but public sector disclosure must not await improvements in private sector disclosure. The IMF can contribute to public sector transparency with respect to critical economic variables through implementation of the Special Data Dissemination Standards, or SDDS. Voluntary compliance with SDDS by all countries will provide some assurance that economic transactions are not being based on misleading or incomplete information. The Need for Capital Mobility A third lesson suggested by the historical analysis is that the ability to attract foreign capital is a critical element of a stable global financial system. In the modern context, I believe that this finding has two corollaries. First, we must avoid blocking capital flows, particularly of long-term funds. There have been policymakers who have advocated capital controls. I do not need to give a group of business economists a lesson in basic economics, but there are those outside of this room who need to be reminded of the benefits of free capital flows. Even though current international financial troubles indicate that there are risks from the free flow of capital, the benefits are substantial and contribute importantly to rising standards of living. As you know, an individual country’s pool of savings does not always equal its pool of investment opportunities. If borders were closed to the flow of capital, interest rates would adjust to equilibrate domestic saving and investment, but some investment opportunities, which might be attractive by international standards, might go unfunded while less attractive domestic opportunities would be funded. With the free flow of capital, countries that have a greater pool of highly productive investment opportunities will benefit from the inflow of capital, and those that export capital can put their savings to better, more rewarding use. This presumes the institutional underpinnings outlined above. Protectionist impulses toward measures that impede the flow of goods, services and managerial know-how must also be avoided. While participation in such free exchange across borders leaves some industries and individuals better off and some faring less well, we know that, on balance, all societies are better off from allowing the free flow of international competition. Given the fact that there are scarce resources globally, it is important that all resources be employed where they hold a comparative advantage, thereby increasing the stock of goods and services available to us all.
Conclusion Let me conclude by saying that while the current era of globalization is quite distinct from the period prior to World War I, there are important lessons that can be learned from that earlier era. Some historians have argued that the pre-1914 era was notable for the size and relative stability of capital flows. Thus far our era has been notable for the size of international capital flows, but recently not for the stability. I do not doubt that there is room for improvement in the regimes that determine capital flows. However, I think that most of the solutions to the challenges that the world has faced require building institutional mechanisms and policy expertise that allow reliance on the strengths of the free market system. Improvements in policy discipline, information sharing, and institutional structure should all be evaluated by the litmus test of whether they enhance the flow of information to, and reduce bureaucratic interference with, markets and thus provide a safer and more stable global financial and economic environment.
Mr Kelley’s observations on Y2K compliance and the banking system Speech by Edward W Kelley, Jr, a member of the Board of Governors of the US Federal Reserve System, to the Media Studies Center, New York, New York on 10 March 1999 Good Morning. My thanks to the Media Studies Center and the Freedom Forum for inviting the Federal Reserve to participate in this meeting and, more important, for the initiative of sponsoring this gathering to discuss what I believe to be a critically important topic. As you may know, I have devoted a good deal of time to Y2K preparation over the last two years, and in that process I have talked to a wide spectrum of people. I have reached the conclusion that, now that the country is at last aggressively addressing the problem, the most important single element in our successfully navigating through the challenges presented by the ‘millennium bug’ is how the public responds to it. There are very few places in our national life where the statement that ‘attitude can affect outcomes’ is so compellingly demonstrated as in the banking sector, particularly as we work to address Y2K. You will no doubt agree that the best way to engender a strong and positive public attitude is through open and candid discussion, which I would like to encourage by briefly addressing three key questions. First, what are the critical components of public confidence in the banking system, and how are the industry, its regulators, and the central bank working to earn public trust in the Y2K context? Second, what are we confident about and what are we concerned about? When the media assesses the financial risk element of the Y2K story, these may be appropriate places to start. And finally, what do we hope the media will do in the coming months? I believe there are four important elements needed for public confidence in banks and the banking system, and each is being addressed as we prepare for potential Y2K problems. Bank customers need to be assured
• of ready access to cash and other funds and bank services
• that their bank records are secure
• that competent and energetic government supervision is being conducted
• that deposit insurance is in place and inviolate
I want to elaborate on these elements because I believe that public understanding of the banking industry’s Y2K preparations will help maintain public confidence. First, cash availability. We do not see any need for a vast surge in demand for cash late this year, but if that should occur, it is the Federal Reserve’s responsibility to have available whatever may be required, and that responsibility will be met. An array of preparations is being made that will ensure currency will be available as needed in Federal Reserve vaults. The biggest challenge we face is getting it to the right place at the right time, and extensive advance planning is under way for that as well. Concerning the availability of other funds and bank services, including assurance that customer records are secure, the federal bank supervisory authorities – of which the Fed is one – are now completing their second inspection of every bank and thrift, some 10,600 institutions. Any problem banks, and all banks identified as keys to systemic safety, will receive additional scrutiny throughout this year. We are inspecting to be certain that precisely those matters I just mentioned are being addressed.
The overwhelming majority of institutions are reported by our examiners to be doing a thorough preparatory job, and they will be fully ready well in advance of December 31, 1999. A competent and energetic supervisory presence is at work, providing not only oversight, but assistance and advice with testing, contingency plans, customer relations, and so forth. Last but not least, the FDIC is fully prepared, and depositors are dependably insured up to $100,000 per deposit. No one can say there won’t be glitches, but all of this gives us a great deal of confidence that the banking sector will be ready. To repeat, we are confident that the overwhelming majority of banks are carefully and thoroughly preparing themselves, that, even if problems should arise, they can and will be readily handled, and that the financial system will not seize up or crash. A guarantee? No. An educated confidence? Yes. Our hope is that the media will carefully check out these assertions and, if they prove to be reasonable, report the story that way. There are concerns, of course. Even though banks themselves are ready, other essential parts of our infrastructure with which banks are interdependent may be less so. The challenges posed by Y2K to the smooth operation of our computer-driven society are so ubiquitous and interrelated that it is quite possible that not everything will be fixed that needs to be fixed in advance of December 31. More than anything else this is a race with time. There is so much work to do that there may not be enough time left for everyone to complete all of the tasks that they well know need to be completed. We don’t have as much information about preparations internationally, and some countries are apparently only now beginning to seriously address the issue. We do not believe this will seriously affect worldwide financial activity, but it could result in some disruptions abroad. These concerns make contingency planning crucial across all of our economy, but they are all matters that we know, at least in the technical sense, how to address. From our vantage point, another important concern is the possibility of our citizenry becoming so overly worried about what might happen that we create the very type of problem we are working so hard to prevent. Many people would like to have an ironclad guarantee that there will be no Y2K disruption, but that guarantee is simply not available. We cannot know in advance exactly how the millennium rollover will go. The truth is that no one can guarantee that everything will work perfectly even later this morning, but we do have every confidence that it will. That should be a message of reassurance, not of concern, and it is a good perspective on Y2K. Public confidence does not require that everyone believe that everything will always work perfectly. People know that is not the case. Rather, the public needs to be confident that the information it is receiving is complete, reliable, and adequate for people to take actions appropriate to their own circumstances. If given this information, we have confidence that the public will keep the Y2K rollover in perspective, realizing it is one more challenge we as a nation will meet. In closing, let me spend a moment on my thoughts about what the media might consider as this story is reported over the next 10 months. This is a rapidly evolving situation, and it demands of those who cover it an ongoing commitment to staying on top of developments, as facts here have a very short half-life.
A given condition of preparedness observed even a few weeks ago is quite likely different today. No one should be a ‘Pollyanna’ about Y2K but, based on the huge amount of work being done to prepare, it is just not responsible to be a ‘Chicken Little’ either. We do not expect the sky to fall. Let’s not let ‘experts’ make unproved assertions, or offer personal opinions as facts, without challenging the basis for them. And let’s not treat isolated events as if they were a broadly general reality. There is much work to be done in local communities, and we hope the media will be a powerful voice in urging the public to seek legitimate information, maintain perspective, and act with common sense. Suggest calm attentiveness to preparations. Provide ‘how to’ pointers and advice. Focus on contingency planning. Suggest and highlight sensible actions by small businesses. Encourage community leadership. Promote education about the problem. Hold opinion leaders and public service providers accountable for accurate, consistent flows of information. Accept that certain things are unknowable and do not agonize over that – we deal with uncertainties every day without loss of confidence. If glitches occur or problems loom, report fully on them, of course, but make sure to place the problem in an appropriate context. Balance and perspective are key. Y2K gives reporters an extraordinary opportunity to do what they do best – learn the facts, weigh the evidence, and inform the public. You have a nation of attentive readers and listeners who very much trust your expertise, and who will prepare for the rollover with your perspective in mind. Complete and reliable information is every bit as important to our country’s success in preparing for Y2K as is all of the technical work being done.
Mr Greenspan discusses financial derivatives and the risks they entail Speech by the Chairman of the Board of Governors of the Federal Reserve System, Alan Greenspan, before the Futures Industry Association, Boca Raton, Florida, on 19 March 1999. By far the most significant event in finance during the past decade has been the extraordinary development and expansion of financial derivatives. This morning I should like to evaluate the scope of these markets, the nature of the risks they entail and some of the difficulties we encounter in managing those risks. At year-end, U.S. commercial banks, the leading players in global derivatives markets, reported outstanding derivatives contracts with a notional value of $33 trillion, a measure that has been growing at a compound annual rate of around 20 percent since 1990. Of the $33 trillion outstanding at year-end, only $4 trillion were exchange-traded derivatives; the remainder were off-exchange or over-the-counter (OTC) derivatives. The greater use of OTC derivatives doubtless reflects the attractiveness of customized over standardized products. But regulation is also a factor; the largest banks, in particular, seem to regard the regulation of exchange-traded derivatives, especially in the United States, as creating more burdens than benefits. As I have noted previously, the fact that the OTC markets function quite effectively without the benefits of the Commodity Exchange Act provides a strong argument for development of a less burdensome regime for exchange-traded financial derivatives. Of course, notional values are not meaningful measures of the risks associated with derivatives. Indeed, it makes no sense to talk about the market risk of derivatives; such risk can be measured meaningfully only on an overall portfolio basis, taking into account both derivatives and cash market positions, and the offsets between them. Clearly, the degree of counterparty credit risk on derivatives depends critically on the extent to which netting and margining procedures are employed to mitigate the risks. In the case of exchange-traded contracts, of course, daily variation settlements by clearing houses strictly limit, if not totally eliminate, such counterparty risks. In the case of OTC derivatives, counterparty credit exposures are far larger, though still a very small fraction of the notional amounts. On a loan equivalent basis, a reasonably good measure of such credit exposures, U.S. banks’ counterparty exposures on such contracts are estimated to have totaled about $325 billion last December. This amounted to less than 6 percent of banks’ total assets. Still, these credit exposures have been growing rapidly, more or less in line with the growth of the notional amounts. The leading role played by U.S. commercial and investment banks in the global OTC derivatives markets is documented in a Bank for International Settlements survey of last June. This survey estimated the size of the global OTC market at an aggregate notional value of $70 trillion, a figure that doubtless is closer to $80 trillion today. Once allowance is made for the double-counting of transactions between dealers, U.S. commercial banks’ share of this global market was about 25 percent, and U.S. investment banks accounted for another 15 percent. While U.S. firms’ 40 percent share exceeded that of dealers from any other country, the OTC markets are truly global markets, with significant market shares held by dealers in Canada, France, Germany, Japan, Switzerland and the United Kingdom. 
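The magnitudes quoted above hang together arithmetically. A small sanity-check sketch, using only the figures in the text (the 1990 base and the implied asset total are back-calculations, not reported numbers):

```python
# Sanity-checking the figures quoted in the speech. All inputs are from the
# text; the 1990 base and the implied asset total are back-calculations.
notional_1998 = 33e12   # $33 trillion notional at year-end
growth_rate = 0.20      # ~20 percent compound annual growth since 1990
implied_1990 = notional_1998 / (1 + growth_rate) ** 8
print(f"implied 1990 notional: ${implied_1990 / 1e12:.1f} trillion")  # ~7.7

credit_exposure = 325e9  # loan-equivalent counterparty exposure, Dec 1998
print(f"exposure as share of notional: {credit_exposure / notional_1998:.1%}")  # ~1.0%
implied_assets = credit_exposure / 0.06  # "less than 6 percent of total assets"
print(f"implied bank assets of at least ${implied_assets / 1e12:.1f} trillion")  # ~5.4
```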
Despite the world financial trauma of the past eighteen months, there is as yet no evidence of an overall slowdown in the pre-crisis derivative growth rates, either on or off exchanges. Indeed, the notional value of derivatives contracts outstanding at U.S. commercial banks grew more than 30 percent last year, the most rapid annual growth since 1994. Although episodes of extreme volatility have produced declines in the most highly leveraged contracts, the growth of the more “plain vanilla” products has continued apace or even accelerated. The reason that growth has continued despite adversity, or perhaps because of it, is that these new financial instruments are an increasingly important vehicle for unbundling risks. These instruments enhance the ability to differentiate risk and allocate it to those investors most able and willing to take it. This unbundling improves the ability of the market to engender a set of product and asset prices far more calibrated to the value preferences of consumers than was possible before derivative markets were developed. The product and asset price signals enable entrepreneurs to finely allocate real capital facilities to produce those goods and services most valued by consumers, a process that has undoubtedly improved national productivity growth and standards of living. Nonbank, as well as bank, users of these new financial instruments have increasingly embraced them as an integral part of their capital risk allocation and profit maximization. It should come as no surprise that the profitability of derivative products has been a major factor in the dramatic rise in large banks’ non-interest earnings and doubtless is a factor in the significant gain in the overall finance industry’s share of American corporate output during the past decade. In short, the value added of derivatives themselves derives from their ability to enhance the process of wealth creation. While the value of risk unbundling has been known for decades, the ability to create sophisticated instruments that could be effective in a dynamic market had to await the last decade’s development of computer and telecommunications technologies. The ability to create and employ sophisticated financial products also galvanized the academic community to develop increasingly complex models of risk management. While recent history suggests that such models are useful, they are doubtless in need of much improvement – an issue to which I will return shortly. Yet beneath all of the evidence of the value of derivatives to a market economy, there remains a deep-seated fear that while individual risks seem clearly to have been reduced through derivative-facilitated diversification, systemic risk has become enlarged, as a consequence. Without question, derivatives facilitate the implementation of leveraged trading strategies, though the very technology that has made derivatives feasible has also improved the ability to leverage without derivatives. Nonetheless, the possibility of increased systemic risk does appear to be an issue that requires fuller understanding. We should point out, first, the obvious. Overall, derivatives are mainly a zero sum game: one counterparty’s market loss is the other counterparty’s market gain. Counterparty credit exposures on OTC derivatives are a different issue and the source of much of the systemic concerns. Such losses rose to record levels in the third quarter of 1998. Nonetheless, the rate of loss remained well below that on banks’ loan portfolios. 
Moreover, the counterparty credit losses in the third quarter can be traced primarily to the extraordinary events in Russia, which produced many defaults on ruble forward contracts. In the fourth quarter such losses dropped sharply, albeit not to the very low pre-crisis rate. The bulk of the losses reported by the major derivative houses for the financially turbulent third quarter of last year reflected declines in the market values of their underlying trading positions, especially in equities, commodities and emerging market debt. Derivative instruments were bystanders. They may well have intensified the losses in underlying markets, but they were scarcely the major players. Yet, through the past decade’s phenomenal growth of the derivative market, there has not been a significant downturn in the economy overall that has tested the resilience of derivative markets. (I operate on the premise that neither human nature nor the business cycle has been rendered obsolete.) While nothing short of a major economic adjustment is likely to test the underlying robustness of the derivative markets, there are reasons to believe that there are some fundamental strengths in these markets. First, despite the growing use of more exotic over-the-counter instruments, the vast majority of trades are relatively straightforward interest rate and currency swaps. The market risk on such swaps is presumably less daunting to individual counterparties than their underlying exposures; otherwise, presumably, the swaps would never have been initiated. Moreover, the credit risks are increasingly subject to comprehensive netting and margin requirements that, although they do not fully remove the risk, significantly ameliorate it. And so far as banks are concerned, capital requirements are applied to such risks as they are to loans that create credit risks quite similar to those of derivatives. Hence, although one may harbor concerns about the overall capital adequacy of banks and their degree of leverage, there is little to distinguish such concerns between risk-adjusted on- and off-balance sheet claims. The one area of risk that needs more thought is so-called potential future exposure. At any particular point in time only a small fraction of the notional value of derivative contracts is in the money – that is, has a positive market value. Because prices will doubtless change in the future, those contracts with negative or even positive values have the potential to reach larger positive values and, hence, to produce a credit loss on default. That future potential for loss upon counterparty default will differ by the nature of the contract. For purposes of supervisory risk-based capital requirements, potential future exposure (over and above the current market value of derivatives, if positive) is currently estimated by separating derivatives into categories based on the underlying instrument (interest rate, exchange rate, commodity, equity, etc.) and the remaining maturity. The capital requirement is then derived by applying fixed factors to each category that reflect differences in the price volatilities of the instruments and the structure of the contracts. Interest rate swaps (70 percent of the notional value of OTC derivatives) have limited long-term loss potential, primarily because the contracts do not provide for an exchange of principal and the exposure is effectively amortized as interest payments are exchanged over the life of the contract.
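As a rough illustration of the supervisory method just described – a credit-equivalent amount equal to current positive market value plus a fixed add-on factor times notional – consider the following sketch. The factors shown follow the commonly published Basle add-on matrix, but should be read as illustrative rather than authoritative, and the contracts are hypothetical.

```python
# A minimal sketch of the supervisory potential-future-exposure ("add-on")
# method described above. Factors follow the commonly published Basle matrix
# of fixed percentages by underlying and remaining maturity; treat them as
# illustrative. The contracts themselves are hypothetical.
ADD_ON = {  # underlying -> {remaining-maturity bucket: factor on notional}
    "interest_rate": {"<=1y": 0.000, "1-5y": 0.005, ">5y": 0.015},
    "fx_gold":       {"<=1y": 0.010, "1-5y": 0.050, ">5y": 0.075},
    "equity":        {"<=1y": 0.060, "1-5y": 0.080, ">5y": 0.100},
}

def credit_equivalent(notional, market_value, underlying, bucket):
    """Current exposure (if positive) plus an add-on for future exposure."""
    current = max(market_value, 0.0)
    return current + ADD_ON[underlying][bucket] * notional

# A ten-year interest rate swap vs. an equity derivative of equal size:
print(credit_equivalent(100e6, 2e6, "interest_rate", ">5y"))  # 3500000.0
print(credit_equivalent(100e6, 2e6, "equity", ">5y"))         # 12000000.0
```

The same notional and the same current market value generate very different charges once the underlying changes, anticipating the comparison that follows.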
Foreign exchange, commodity and equity derivatives, of course, entail far greater exposures, either because principal amounts are exchanged or because the underlying’s price is more volatile. This approach to regulatory capital requirements is not altogether satisfactory. The most sophisticated derivative dealers parse their derivatives book in more detail. And certainly a single point estimate cannot capture the range of losses that might reasonably be experienced. Hence, in evaluating derivatives risk, far more stress testing of the lower probability outcomes is a necessity. Even a one in 500 occurrence does happen once every 500 times, and if that occurrence could threaten the franchise value of the derivatives counterparty, it is an important concern for risk aversion. But we have to be careful of how we view these ostensibly low-probability events. They are low-probability only if we presume that the reality from which these events derive is best represented by a single bell-shaped probability distribution, be it a normal distribution or even a fat-tailed one. Modern quantitative approaches to risk measurement and risk management take as their starting-point historical experience with market price fluctuations, which is statistically summarized in probability distributions. We live in what is mostly a stable economic system in which market imbalances give rise to continuous and inevitable moves toward equilibrium resolutions. However, the violence of the responses to what seemed to be relatively mild imbalances in southeast Asia in 1997 and throughout the global economy in August and September of 1998 has raised the possibility of a discontinuous adjustment process. Almost all the time investors adopt strategies that seek profit only in a relatively long-term context, fostering the propensity for convergence toward equilibrium that ordinarily characterizes financial markets. But from time to time (and quite possibly with increasing frequency) the resulting propensity toward convergent equilibrium has given way as investors suffer an abrupt collapse of comprehension of, and confidence in, future economic events. Risk aversion accordingly rises dramatically and deliberative trading strategies are replaced by rising fear-induced disengagement. Yield spreads on relatively risky assets widen dramatically. In the more extreme manifestation, the inability to differentiate among degrees of risk drives trading strategies to ever more liquid instruments. Strategies become so tentative that traders want the capacity to reverse decisions at minimum cost. As a consequence, even among riskless assets, illiquidity premiums rise dramatically as investors seek the heavily traded “on-the-run” issues. History tells us that sharp reversals in confidence happen abruptly, most often with little advance notice. They are self-reinforcing processes that can compress into a very short time period. Panic market reactions are characterized by a dramatic shift to maximize short-term value, and are an extension of human behavior that manifests itself in all forms of human interaction – a set of responses that does not seem to have changed over the generations. I defy anyone to distinguish a speculative price pattern for 1999 from one for 1899 if the charts specify neither the dates nor the levels of the prices. If this paradigm turns out to be the appropriate representation of the way our economy and our financial markets will work in the future, it has significant implications for risk management.
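The next passage develops the statistical consequence of such a regime shift. As a minimal numerical illustration – with entirely made-up parameters – compare a value-at-risk estimate fitted to calm periods against one that admits a rare panic regime:

```python
# Minimal illustration, with made-up parameters, of the point developed
# below: tail risk estimated from "panicless" periods understates the
# true tail once a rare panic regime is admitted into the distribution.
import random

random.seed(0)
N = 200_000
P_PANIC = 0.01  # one day in a hundred, purely illustrative

def calm():     # daily return (percent) in the calm regime
    return random.gauss(0.0, 1.0)

def panic():    # rare panic regime: large losses, high volatility
    return random.gauss(-4.0, 3.0)

mixture   = [panic() if random.random() < P_PANIC else calm() for _ in range(N)]
calm_only = [calm() for _ in range(N)]

def var_99(returns):  # the 1st-percentile return, i.e. the 99% VaR threshold
    return sorted(returns)[int(0.01 * len(returns))]

print("99% VaR, calm-only model :", round(var_99(calm_only), 2))  # near -2.33
print("99% VaR, panic admitted  :", round(var_99(mixture), 2))    # nearer -2.7
```

The same logic applies to correlations, as the passage below explains: joint distributions fitted to calm periods overstate the diversification available in a panic.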
Probability distributions estimated largely, or exclusively, over cycles excluding periods of panic will underestimate the probability of extreme price movements because they fail to capture a secondary peak at the extreme negative tail that reflects the probability of occurrence of a panic. Furthermore, joint distributions estimated over panicless periods will underestimate the degree of correlation between asset returns during panics when fear and disengagement by investors result in simultaneous declines (or, in rare instances, increases) in values as investors no longer adequately differentiate among degrees of risk and liquidity. Consequently, the benefits of portfolio diversification will tend to be significantly overestimated by current models. Such a view of the world would also have important implications for approaches to the prudential oversight of capital adequacy for banks and other financial institutions. Regulatory minimum capital requirements for banks’ trading portfolios are now based on the banks’ own internal risk measurement models. Furthermore, regulators are exploring the potential for using an internal models approach to credit risk in the banking book. Some may now argue that the periodic emergence of financial panics implies a need to abandon models-based approaches to regulatory capital and to return to traditional approaches based on regulatory risk measurement schemes. In my view, however, this would be a major mistake. Regulatory risk measurement schemes are simpler and much less accurate than banks’ risk measurement models. Consequently, they provide banks with the motive and the opportunity to engage in regulatory arbitrage that seriously undermines the regulatory standard and frustrates the underlying safety and soundness objective. Specifically, they induce banks to reduce holdings of assets where risks and regulatory capital are overestimated by regulators and increase holdings of assets where risks are underestimated by regulators. It would be far better to provide incentives for banks to enhance their risk modeling procedures by taking account of the potential existence and implications of discontinuous episodes. Scenario analysis can highlight vulnerabilities to the kind of flights to quality and flights to liquidity that seem increasingly frequent. Stress testing of correlation assumptions can reveal the disappearance of apparent diversification benefits in such scenarios. Stress testing requirements already are part of the internal models approach to capital requirements for market risks in bank trading accounts. Stress testing of estimates of counterparty credit risks should also be required. The logic is the same as for market risk. The factors that are used to determine supervisory capital requirements for counterparty credit exposures are based on statistical analyses of non-panic periods. Moreover, during panic periods the usual assumption that potential future exposures are uncorrelated with default probabilities becomes invalid. For example, the collapse of emerging market currencies can greatly increase the probability of defaults by residents of those countries at the same time that exposures on swaps in which those residents are obligated to pay foreign currency are increasing dramatically. Supervisors should avoid any temptation to increase the supervisory factors for potential future exposure to address these crisis scenarios, which have vastly different implications for different combinations of contracts and counterparties. 
But they can and should review the requirements relating to the scenarios to be simulated by the bank and the incorporation of stress test results into the policies and limits set by the bank’s management and board of directors. As we approach the twenty-first century, both banks and nonbanks will need to continually reassess whether their risk management practices have kept pace with their own evolving activities and with changes in financial market dynamics and readjust accordingly. Should they succeed I am quite confident that market participants will continue to increase their reliance on derivatives to unbundle risks and thereby enhance the process of wealth creation.
Mr Kelley remarks on the challenges of the Year 2000 computer problem Remarks by Mr Edward W Kelley, Jr, member of the Board of Governors of the US Federal Reserve System, Owens Distinguished Lecture Series, Owens Graduate School of Management, Vanderbilt University on 25 March 1999 Thinking about Y2K I am delighted to be back on this campus again at the invitation of my old friend, Dr Dewey Daane, who served before me as a governor of the Federal Reserve. These are challenging times for economic policymakers, and there are many issues currently facing our nation’s central bank. But at the top of the list as a first priority is the Year 2000 computer problem, and that is our subject for today. Recently this topic has been receiving a great deal of attention and I’m sure that everyone here is familiar with the basic issue – specifically, that information generated by computers may be inaccurate, or that computers and electronic systems may malfunction because they cannot correctly process Year 2000 dates. With that stipulation, I will dwell no further on the nature of the problem itself, but rather attempt to focus on its likely economic impact. The economic stakes here are potentially very large and the spectrum of possible outcomes potentially very broad, ranging from minimal to very serious. For the truth of the matter is that this episode is unique: We really have had no previous experience with a challenge of this sort to give us reliable guideposts. Although the lack of a precedent may be unnerving, that certainly does not free us from the obligation to attempt to analyze how the millennium bug is affecting and will affect the economy. This economic puzzle has many complex pieces – some of them I fear quite inscrutable before the event – and my task today is to assemble for you as coherent a picture as possible of where the Y2K problem appears likely to take us. Please be forewarned that an assessment of this situation has a very short half-life, as conditions are evolving rapidly. The good news is that they are evolving favorably; the bad news is that the time remaining before the rollover date is passing by rapidly. Let me first turn to the reasons why Y2K has been so challenging. Then I shall discuss the actions that are being taken by both the public and private sectors to deal with the millennium bug and the effects these measures are having on economic activity at the present time. Finally, I shall turn – not without some trepidation, I might add – to an assessment of the spectrum of plausible outcomes for the millennium rollover as I see them. As you will shortly hear, I believe the Y2K alarmists have not fully recognized the attention that is being given to this challenge or the significant progress that is being made toward meeting it. Given what we know today, I am cautiously but increasingly optimistic that the millennium bug will not cause major economic disruptions when it bites. And I am quite confident that the financial system will be well prepared. Why was it so difficult for so long for people to come to terms with the Year 2000 problem? At the most basic level of any organization – be it public or private, large or small – the Y2K problem was all too easy to ignore. It is a hidden threat, cloaked in the arcane language of computer programs and in embedded microchips. As such, it was difficult at first for senior management in both the private and public sectors to recognize the serious nature of the problem.
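For readers who have not seen the defect itself, a hypothetical two-line example captures it, along with the "windowing" repair that many remediation projects actually used (the pivot value here is an application-specific choice, not a standard):

```python
# The millennium bug in miniature: a hypothetical system that stores years
# as two digits, and the common "windowing" repair.
def years_elapsed_buggy(start_yy, end_yy):
    # Legacy logic: "00" is read as 1900, so a loan made in 1999 and
    # maturing in 2000 appears to run for -99 years.
    return end_yy - start_yy

def expand_with_window(yy, pivot=50):
    # Windowing fix: two-digit years below the pivot are read as 20xx,
    # the rest as 19xx. The pivot is a per-application choice.
    return 2000 + yy if yy < pivot else 1900 + yy

print(years_elapsed_buggy(99, 0))                      # -99: the bug
print(expand_with_window(0) - expand_with_window(99))  # 1: after repair
```

Multiplied across millions of lines of decades-old code and countless embedded chips, a defect this small becomes genuinely hard even to locate, which is part of why recognition came so slowly.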
This was compounded by the fact that the costs and benefits of the solution for an individual operation were neither very easily quantified nor very attractive. But after an initial period of denial, organization leaders in the United States have now recognized the problem and are taking aggressive action to correct it. Nonetheless, given the pervasiveness of the problems involved, I suspect that even the most thoroughly prepared organizations are concerned that something significant might be missed. Consequently, responsible and careful organizations are developing extensive contingency plans to work around any emerging problems as insurance against Y2K disruptions. Thus, insecurity about the comprehensiveness of Y2K remediation efforts is affecting corporate investment and production plans now and will do so into next year. The second feature of the millennium bug that makes it so difficult to analyze is the interrelated character of many computer systems. An individual company may be satisfied that it has done all it can to fix its own systems, but it may still feel vulnerable to the actions taken by its suppliers and customers. For example, what good would it do you to be perfectly ready if your electricity is off? Or if the train bringing in tomorrow’s production materials is delayed? In an environment where ‘just-in-time’ inventory systems and electronic data interchanges have linked economic activities very closely together, one firm’s failure has the potential to ripple through significant segments of the chain of production, services, and distribution. Thus, coordination of Y2K remediation activities would have benefits for everyone. Of course, it is clearly impossible to coordinate the Y2K activities of millions of individual establishments. To help fill this void, numerous organizations have emerged as clearinghouses of information, and other institutions and consortiums are functioning as vehicles for system testing. The Federal Reserve, for instance, has established an extensive, separate and dedicated computer environment for the purpose of testing with its depository institution customers, and as we speak the securities industry is undergoing a five-week long series of tests to assess its ability to conduct business through its network of institutions. Bank supervisors, including the Federal Reserve, are holding their banks accountable for the effectiveness of their Y2K efforts, and I can assure you that there will be regulatory consequences for banks that do not fulfill their obligations. But many other organizations are on their own to test their critical systems with their key suppliers and customers. Because this situation is ubiquitous, complex, and fragmented, it is a very difficult task to quantify the aggregate costs of Y2K remediation. Similarly, and perhaps more importantly, we also have no national scorecard on how effective our economy’s remediation efforts are, and until quite recently we had very little national preparedness information at all. Under these circumstances, it is not hard to understand why the millennium bug is viewed as such an unpredictable phenomenon, and why it has attracted so much gloom and doom commentary. So, what is being done? The short answer is a great deal is being done. In January the President’s Commission on the Year 2000 issued the first of its promised quarterly assessments of the state of preparation in our country and internationally.
More recently we have had helpful assessments from the Congress and various security analysts. From these and other sources, let me review for you our understanding of the status of efforts by the private sector, government entities, and the world community to deal with the Year 2000 problem. As far as the private sector is concerned, efforts to deal with the millennium bug have been steadily intensifying and are now proceeding very rapidly. I think it is a good sign that in the last few days articles have been appearing in the business press that the numerous Y2K remediation boutiques which sprang up over the past two years are beginning to experience a slowdown in activity and are starting to look for other things to do. In Congressional testimony last spring, I suggested that the private sector might spend approximately $50 billion between 1998 and 2000 to tackle the Y2K problem. This figure was based on research done by Fed economists and while our estimate of a ‘$50 billion bug’ still seems to be reasonable, I do expect this figure to move upward as we learn more throughout the year. I also perceive that the tools available to companies to address Y2K issues have increased substantially. Over the past months, most major computer hardware and software companies have released documentation of the Y2K readiness of their products on their websites. Similarly, most of the major computer publications now have elaborate ‘how-to’ guides on their web pages that will aid consumers and small businesses in their efforts to make their systems compliant. Commercial software producers have also been busy, and new software products are available to aid programmers in repairing code. I hope and believe people are availing themselves of these new resources. As far as depository institutions are concerned, I am encouraged by the progress that has been made over the past year, and there is every reason to be confident that our banking system will be ready. Based on ongoing reviews of all depository institutions as completed by federal banking regulatory agencies, the vast majority is making satisfactory progress in their Y2K planning and readiness efforts. Only a small percentage has been rated ‘needs improvement’ and well under 1% have been rated ‘unsatisfactory’. In these cases, the agencies have initiated intensive supervisory follow-up, including restrictions on expansionary activities by deficient organizations. For the rest of this year the agencies will be continually revisiting any institutions identified as having problems, as well as all those identified as being key to systemic health. While we can be confident institutions are addressing the problem, it is important to recognize that regulators cannot be responsible for ensuring or guaranteeing the Y2K readiness and viability of the banking organizations they supervise. Rather, the board of directors and senior management of banks and other financial institutions must be responsible for ensuring the institutions they manage are able to provide high quality and continuous service in January 2000. They have every motivation to do so – their survival is at stake. The Federal Reserve System has itself made great progress on Y2K issues, meeting the goals we set for ourselves. In addition to completing two rounds of reviews of the Y2K readiness of all banks subject to our supervisory authority, we have renovated and tested virtually all our own applications. 
As mentioned, we have opened our mission-critical systems to customers for testing with us and have progressed significantly in our contingency planning efforts. For the balance of this year, we will be focusing intensely on contingency planning, as we believe that is the best way to be ready to deal with any possible surprises. As in the private sector, activity to fix computer systems maintained by the federal government has been intensifying. Substantial progress has been made in many areas, but the President’s Council agrees that much more work still needs to be done. Its reviews of federal Y2K programs have highlighted areas needing improvement, as well as many areas where preparations are well under control. The Council’s evaluation of every agency and department is publicly available in its quarterly summary, so I will not attempt to go through it chapter and verse. The current estimate of federal spending for all preparation is $6.8 billion. Last fall, legislation was enacted that facilitates the sharing of Y2K information among businesses and clarifies the legal liability associated with reporting it. All of these are positive developments. Far less is known about the effectiveness of the Y2K preparations by state and local governments. At the state level, a survey of web pages indicates that most states have extensive and impressive programs under way, but by recent count several states had no reference to Y2K preparedness at all and others were quite vague about what was going on under their control. We can identify $3.4 billion earmarked by states, but I am confident that number is low. While attention is often focused on high-profile systems such as the nation’s air traffic control systems and its electric power grid, there are many smaller, yet quite critical, electronically driven systems maintained by counties and municipalities that are also vulnerable. These include such services as water, police, traffic control, and health and welfare activities. And as any Washington or Nashville commuter knows, one or two malfunctioning traffic signals can cause serious congestion, confusion, and delay, and the breakdown of traffic management systems could cause near total gridlock. I hope that local media in every area will increasingly focus attention on Y2K preparation and hold local leaders accountable for preparing the infrastructure in their areas of responsibility. On the international level, there is both good news and bad. The governments of various industrialized nations have stepped up their own internal Y2K programs over recent months, and international cooperation is intensifying through efforts such as the Joint Year 2000 Council, chaired by my colleague Federal Reserve Governor Roger Ferguson. Most large multinational corporations report that they are well along in their own preparations worldwide, and many of them are pushing their numerous local suppliers to be prepared to maintain the flow of materials and services. That is a significant positive step. The recent conversion to the euro was very smooth, demonstrating that a job similar to the one we have at hand can be successfully accomplished. But that intense focus in Europe, along with other world financial troubles, has obviously been deflecting all too much attention away from Y2K issues. I worry that time will simply run out for some activities in some countries, particularly in the developing economies, and as a result risks exist for some level of disruption in various locales around the world. 
All of this has been affecting our economy in a variety of ways. On the positive side, an important element in some Y2K programs is the accelerated replacement of aging computer systems with modern, state-of-the-art hardware and software. Such capital expenditures should raise the level of productivity in those enterprises, and addressing Y2K has increased the awareness of many senior executives of the complexity and importance of carefully managing their corporate information technology resources. The increased replacement demand also has contributed to the spectacular recent growth in this country’s computer hardware and software industries. A reverse effect, which I believe will shortly become visible, is that many institutions will ‘freeze’ their remediated and tested systems for the remainder of this year, effectively foregoing the installation of major new hardware and software systems. This defers some technology spending from 1999 into 2000. So, ultimately, we are largely shifting the timing of these investment expenditures, rather than changing their total amounts very much. Another area in which uncertainty about Y2K readiness is likely to have noticeable effects in 1999 is the management of inventories. As the millennium approaches, I expect businesses will want to hold larger inventories of goods as insurance against Y2K-related supply disruptions. Such a shift from ‘just-in-time’ inventory management to a ‘just-in-case’ posture is likely to prompt an increase in orders and production during late 1999, with these stocks subsequently being run off in the first half of 2000. We at the Fed, for example, will do precisely that in our management of the production of new currency. While Year 2000 preparation efforts may give a temporary boost to economic activity in some sectors, the probable net effect on the aggregate economy is slightly negative. Other than the obviously very valuable ability to maintain operations across the millennium, few quantifiable benefits accrue to most firms for their extensive remedial work. It is fair to think of Y2K as a huge one-time maintenance project, which is costly on balance and produces no additional product. Our estimates of the net effect of Y2K remediation efforts, on both our nation’s overall labor productivity and on real gross domestic product, are that it will likely shave one or two tenths of a percentage point off our growth rate (roughly what one would expect from spending on the order of $50 billion spread over several years in a nearly $9 trillion economy), but a more substantial effect is possible if some of the larger estimates of Y2K costs turn out to be accurate. Let’s move on to the bottom line. Will every organization and everybody everywhere be fully prepared, so that everything will go off without a hitch? I seriously doubt it. As we have discussed, a great deal of work is already completed or planned to deal with the problem, but what if something does slip through the cracks, and we experience the failure of ‘mission-critical’ systems? How would a computer failure in one area of the economy affect the ability of others to continue to operate smoothly? How severe could the consequences of Y2K problems emanating from abroad be? The number of possible scenarios of this type is endless, and today no one can say with absolute confidence how severe any Y2K disruptions could be, or how a failure in one sector would influence operations in others. That said, let me now turn to a discussion of the spectrum of plausible outcomes for economic activity in 2000. What will happen as the millennium rolls over? 
A few economists are suggesting that Y2K-related disruptions will induce a deep recession. That probably is a stretch, but it is unlikely we will escape unaffected. I anticipate that there will be isolated production problems and disruptions to commerce, and perhaps to some public services, that could reduce the pace of economic activity early in 2000. As mentioned above, at least a mild inventory cycle seems very likely to develop. But, just like the shocks to our nation’s physical infrastructure that occur periodically, the Y2K impact on our information and electronic control infrastructure is most likely to be short-lived and fully reversed. Most of us have experienced examples of how economic activity has been affected by disruptions to the physical infrastructure of some part of our country. Although the Y2K problem is clearly unique, and therefore the usefulness of any analogy is limited, analyzing some of these disruptions to our physical infrastructure may be useful in organizing our thinking about the consequences of short-lived interruptions in our information infrastructure. Many of us have experienced major bad weather episodes: a severe snow or ice storm, a flood, a tornado, or perhaps a hurricane. Commerce may grind to a halt for up to a week or so in an area, but activity bounces back rapidly once things are cleaned up. Although individual firms and households can be adversely affected by these disruptions, in the aggregate, the economy quickly recovers most of the output lost due to such storms. In these instances, the shock to our economic infrastructure is transitory in nature, and, critically, the recovery process is under way before any adverse ‘feedback’ effects are produced. A similar example is the strike not long ago by workers at United Parcel Service. UPS is a major player in the package delivery industry in this country, and the strike disrupted the shipping patterns of many businesses. Some sales were indeed lost but – and this is critical here – in most instances, alternate shipping services were found for high priority packages. Some businesses were hurt by the strike, but its effect on economic activity was small in the aggregate. In fairness it must be said that if the disruptions that occur are not isolated events, as I have assumed, but instead spread across key sectors of the economy by interacting with each other, then there could indeed be a more significant effect on aggregate activity in the first quarter of 2000. The more dire of the Y2K scenarios would involve, among other things, a perpetuation and intensification of these interactive effects and their subsequent feedback. Should this occur, production disruptions could turn out to be a national or international phenomenon and could spread from one industry to another. Under these circumstances, the decline in economic activity would prove to be longer lasting, and a recession could conceivably ensue. But let me quickly stress that I do not think this recession scenario has a very high probability. It is possible, but a lot of things have to go wrong for it to occur, and much is being done to prevent its occurrence. Now you might appropriately ask a Fed representative what monetary policy can do to offset any Y2K disruption. The truthful short answer is ‘not much’. We can’t plow the streets or deliver packages, and we would be unable to reprogram the nation’s computers for 2000. 
The Y2K problem is primarily an issue affecting the aggregate supply side of the economy, whereas the Federal Reserve’s monetary policy works mainly on aggregate demand. We all understand how creating more money, and lowering the level of short-term interest rates, gives a boost to interest-sensitive sectors, such as homebuilding, but these tools are unlikely to be very effective in generating more Y2K remediation efforts or in accelerating the recovery process if someone experiences some type of disruption. We will, of course, be ready if people want to hold more cash on New Year’s Eve 1999, and we will be prepared to lend whatever sums may be needed to financial institutions through the discount window or to provide needed reserves to the banking system through open market operations. And, in the unlikely event a serious Y2K disruption should have significant feedback effects on aggregate demand, such as I outlined earlier, there obviously would be a role for the Federal Reserve to play in countering a downturn. But there is nothing monetary policy can do to offset the direct effects of a Y2K disruption. In summary, as I stated at the outset of my remarks, I am cautiously but increasingly optimistic that the United States will weather this storm without major disruptions to economic activity. Some of the more frightening scenarios would not be without a certain plausibility if this challenge were being ignored. But it is not being ignored, as indeed this meeting today clearly illustrates. An enormous amount of work is being done in anticipation of the rollover of the millennium. As the world’s largest economy, the United States bears the heaviest burden of preparation. But this is truly a worldwide issue, and to the extent some are not adequately prepared and experience breakdowns of unforeseeable dimension, we could all be affected accordingly. We at the Federal Reserve intend to do our utmost, and we hope and trust others will do likewise.
board of governors of the federal reserve system | 1999 | 3
Testimony of Mr Laurence H Meyer, a member of the Board of Governors of the US Federal Reserve System, on hedge funds before the Subcommittee on Financial Institutions and Consumer Credit of the Committee on Banking and Financial Services, US House of Representatives, on 24 March 1999.
Mr Meyer’s testimony on the Federal Reserve’s policy guidance to banks regarding hedge funds
Testimony of Mr Laurence H Meyer, a member of the Board of Governors of the US Federal Reserve System, on hedge funds before the Subcommittee on Financial Institutions and Consumer Credit of the Committee on Banking and Financial Services, US House of Representatives, on 24 March 1999.
I welcome this opportunity to discuss the Federal Reserve’s supervisory actions in the aftermath of the near-collapse of Long-Term Capital Management (LTCM). Today’s hearings cover an important topic. The LTCM incident merits study to ensure that the lessons it provides are sufficiently understood and that constructive action is taken to effectively reduce the potential for similar events in the future, without compromising the efficiency of global capital markets. The primary issues raised by the LTCM incident appear to revolve around the broad theme of how to control the leverage and risk-taking of unregulated financial institutions – in particular, hedge funds – so that they do not become a source of systemic risk or jeopardize taxpayer funds via the federal safety net. In our market-based economy, the discipline provided by creditors and counterparties is the primary mechanism for “regulating” this risk-taking. In the case of LTCM, this discipline appears to have been compromised. Weaknesses in several key elements of the risk management processes at some creditors and counterparties were magnified by competitive pressures, resulting in risk exposures that may not have been fully understood or adequately managed. Less-than-robust risk management systems, evidenced by an over-reliance on collateral, compromised both the assessment of counterparty creditworthiness and the measurement and control of risk exposures at several financial institutions. To be sure, the lessons stemming from this episode have not gone unlearned, and there is no lack of effort to identify and implement appropriate public policy and private sector responses to the potential risks posed by hedge funds. These efforts range from private industry and supervisory initiatives aimed at strengthening the credit risk management infrastructures at financial institutions, to consideration of enhanced disclosure by global financial institutions, to evaluations of the costs and benefits of direct regulation of hedge funds. Efforts to promote market discipline by strengthening the risk management systems of creditors and counterparties offer the most immediate and efficient way to accomplish the desired objective of minimizing the potential for systemic risk arising from the activities of hedge funds. Supervisory oversight of bank risk management practices, including the issuance of guidance on sound practices, reinforces the market discipline entailed in banks’ assessment and surveillance of the risks taken by their counterparties. The recent guidance on sound risk management practices issued by the Basle Committee on Banking Supervision, the Federal Reserve, and the Office of the Comptroller of the Currency (OCC) represents a significant step toward achieving the goal of enhancing market discipline. I commend the Subcommittee’s efforts to advance public awareness of these efforts by holding today’s hearings on this recent supervisory guidance. Of course, public sector work on promoting more effective market discipline on hedge funds and other entities that might employ leverage is by no means complete. 
The guidance and other supervisory efforts we are discussing here today target primarily commercial banking institutions. Work underway by the International Organization of Securities Commissions (IOSCO) to issue similar guidance regarding securities firms’ relationships with hedge funds is another important step. Although not directly focused on the issue of hedge funds, international efforts to enhance public disclosure of financial institution risk profiles may also provide meaningful input. In this context, the recent consultative paper “Recommendations for Public Disclosure of Trading and Derivatives Activities of Banks and Securities Firms,” issued jointly last month by the Basle Committee on Banking Supervision and IOSCO, makes an important contribution to the discussion of possible public policy responses. In the United States, the President’s Working Group on Financial Markets is considering a number of issues and policy responses regarding leveraged institutions and their relationships with their counterparties. Its report is expected in the near future. Despite these various public sector initiatives, the real key to effective market discipline lies with the players themselves – the private sector. The market has clearly learned from the LTCM incident, and our supervisory staff has seen significant tightening of credit standards on hedge funds as well as improvements in the risk management processes at major banking institutions. Here, too, much work remains. Accordingly, we look forward to the recommendations of the Counterparty Risk Management Policy Group (CRMPG) regarding private sector initiatives for enhancing the credit risk management practices of creditors and their leveraged counterparties. Subcommittees of this private industry group, comprised of major international banks, securities firms, and hedge funds, are investigating avenues for improving measures of derivative exposures and the exchange of information between counterparties. The findings of the group will reinforce efforts to promote enhancement of risk management systems at banking institutions and are expected to advance sound practices in key areas, such as the type of information that can be exchanged between hedge funds and their counterparties without compromising hedge funds’ proprietary information.
Supervisory Efforts by the Federal Reserve in the Aftermath of LTCM
In its role as a bank supervisor, the Federal Reserve’s primary contribution to advancing market discipline lies in its responsibility to ensure that the risk management processes at individual banking organizations are commensurate with the size and complexity of their portfolios. We promote the adoption of sound risk management practices through on-site reviews and targeted examinations of banking organizations and by regularly issuing supervisory guidance to both banks and our supervisory staff. This morning I will briefly summarize recent Federal Reserve efforts in both of these areas and will explain how Federal Reserve supervisory guidance provides direction to banking institutions and examiners that supports, and is consistent with, that issued by the Basle Committee on Banking Supervision and the Office of the Comptroller of the Currency. President McDonough’s testimony answers the Subcommittee’s questions regarding the recent guidance on highly leveraged institutions (HLIs) issued by the Basle Committee on Banking Supervision and the regulation of hedge funds in other developed countries. 
Immediately following the LTCM episode, the Federal Reserve detailed staff from the Board of Governors and the Federal Reserve Bank of New York to conduct special reviews at those state member banks with significant hedge fund relationships to identify:
• the nature and magnitude of bank credit exposures to hedge funds;
• the comprehensiveness of banks’ due diligence processes regarding hedge funds;
• the quantitative controls used in managing exposures to hedge funds;
• the adequacy of management information systems and internal controls with regard to hedge fund counterparties; and
• the extent to which the LTCM relationship was an exception to banks’ normal hedge fund relationships.
Our review found that U.S. commercial banking exposures to hedge funds are primarily counterparty exposures arising from OTC derivatives contracts. Overall, direct unsecured loans to hedge funds have traditionally been a small portion of bank lending, even at the larger global institutions. As of the third quarter of 1998, direct, unsecured loans disbursed by all U.S. commercial banks to hedge funds were estimated at $1.7 billion, or approximately one percent of the Tier 1 capital of those banking institutions with exposures to hedge funds. This amount represents only direct lending arrangements and includes $170 million in loans disbursed by U.S. banks to LTCM under a $900 million shared national credit facility (most of which was contributed by foreign banking organizations). It does not include the $900 million of equity investments in LTCM made by three U.S. banking organizations in September of 1998. As of the third quarter of 1998, only five U.S. commercial banks had material OTC derivative exposures to hedge fund counterparties. Credit exposures arising from these relationships consisted of the current mark-to-market value of the derivative transactions as well as the potential exposure that might arise from future changes in these market values (the potential future exposure or PFE). All of the banking institutions mark their derivative positions to market on a daily basis and require any net current market value owed to them to be fully collateralized, generally with high quality securities, such as U.S. Treasuries or sovereign debt from G-10 countries. For those hedge funds judged to be of lower credit quality, banks generally require the posting of collateral or margin above current market values to protect against the potential future exposure of derivative contracts with these counterparties. With regard to LTCM, the review found that the fund was atypical among hedge fund counterparties in both the size of its positions and the amount of leverage it employed. While several hedge funds had larger net asset values (capital) than LTCM and a few funds may have employed the same or comparable book leverage, LTCM’s combination of size and leverage was singular. Investigations of the management of the LTCM account at several institutions found that an over-reliance on the collateralization of the current market value of derivatives positions and on the stature of LTCM’s managers led to compromises in several key elements of the credit risk management process. In some cases, assessments of LTCM’s creditworthiness were found to be less than adequate as a result of limited information on the fund’s true risk profile and risk management capabilities. In particular, exposure measures and scenario analyses that could have identified potential losses under stress situations were found to be less than adequate. 
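To make the distinction between collateralized current exposure and potential future exposure concrete, the following sketch may help. It is purely illustrative: the add-on factors, product names, and the simple notional-times-factor formula are assumptions chosen for exposition, not the methodology of any particular bank or supervisor. The point it captures is the one the review surfaced: a position that is fully collateralized against today’s mark-to-market value can still carry substantial exposure under stressed market conditions.

# Illustrative only: a toy calculation separating the collateralized current
# exposure of a derivatives position from its potential future exposure (PFE).
# The add-on factors below are invented for exposition.
from dataclasses import dataclass

ADD_ON_FACTORS = {"interest_rate_swap": 0.005, "fx_forward": 0.05, "equity_swap": 0.10}

@dataclass
class Position:
    product: str
    notional: float        # contract notional, in dollars
    mark_to_market: float  # current replacement value to the bank
    collateral: float      # market value of collateral posted by the counterparty

def current_exposure(p: Position) -> float:
    """Replacement cost today: only a positive mark-to-market value is an exposure."""
    return max(p.mark_to_market, 0.0)

def potential_future_exposure(p: Position, stress_multiplier: float = 1.0) -> float:
    """Crude add-on estimate of how much the exposure could grow before any
    default; a stress_multiplier above 1 mimics unusually volatile markets."""
    return p.notional * ADD_ON_FACTORS[p.product] * stress_multiplier

def uncollateralized_exposure(p: Position, stress_multiplier: float = 1.0) -> float:
    """Exposure left after collateral: collateralizing today's mark-to-market
    does not cover the potential future exposure."""
    total = current_exposure(p) + potential_future_exposure(p, stress_multiplier)
    return max(total - p.collateral, 0.0)

pos = Position("fx_forward", notional=500e6, mark_to_market=10e6, collateral=10e6)
print(uncollateralized_exposure(pos))                         # 25 million: the PFE is uncovered
print(uncollateralized_exposure(pos, stress_multiplier=3.0))  # 75 million under a 3x stress

Run as written, the example shows a position whose current mark-to-market value is fully collateralized yet which leaves $25 million of potential future exposure uncovered in normal conditions, and three times that under a stress scenario of the sort seen in the autumn of 1998.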
Importantly, while LTCM was found to be atypical among hedge fund counterparties, shortcomings in the risk management of hedge fund counterparty exposures appeared to extend beyond this one fund. In several cases, the review team found inadequate counterparty risk management policies and procedures. In others, while formal policies and procedures may have existed, gaps between policy and practice were identified. Specifically, the review team found that the due diligence and ongoing risk assessments of hedge funds were largely qualitative and lacked quantitative rigor. The review also found compromises in the limit systems and credit exposure measurement methodologies employed, including limited use of counterparty exposure stress testing. In particular, measures of the potential future exposures arising from derivative positions with hedge fund counterparties were found to need significant enhancement at some banks. In general, banks placed undue reliance on the collateralization of current mark-to-market exposures and underestimated the potential exposure that could arise under difficult market conditions. The findings of this special review served as a primary source for the Basle Committee’s recent report, “Sound Practices for Banks’ Interactions with Highly Leveraged Institutions”. Federal Reserve staff played a major role in shaping the scope of the Basle documents, drafted significant portions of early versions of the Basle Committee’s main paper, and provided significant input into its sound practices paper. President McDonough’s testimony discusses, at length, the content of the Basle documents, including the sound practices they identify. The Board of Governors fully endorses both Basle documents. The Basle guidance has been incorporated into Federal Reserve guidance by direct reference in our recent Supervision and Regulation Letter, “Supervisory Guidance Regarding Counterparty Credit Risk Management” (S.R. 99-3). Both Basle documents have been transmitted by the appropriate Federal Reserve Banks to all state member banks and holding companies with significant hedge fund exposures and to all Federal Reserve staff supervising those institutions. Moreover, the March 1999 update to the Federal Reserve’s Trading and Capital Markets Activities Manual will incorporate the specific sound practices identified in the Basle documents in a special hedge fund subsection of its existing Counterparty Credit Risk Management section. The results of the targeted reviews conducted in the third and fourth quarters of 1998 have been shared with each institution reviewed, and supervisory plans tailored to each institution’s particular circumstances have been developed. Supervisory staff is monitoring each bank’s management of hedge fund counterparty exposures as well as the bank’s efforts to address any identified risk management shortcomings. I understand that other G-10 bank supervisors have translated the Basle documents into the appropriate languages and transmitted them to industry associations and/or institutions with hedge fund relationships. In some cases, supervisors have taken steps to monitor bank hedge fund exposures and bank initiatives to enhance internal counterparty credit risk management systems.
Private Sector Response in the Aftermath of LTCM
As would be expected coming out of the LTCM event and other market difficulties in 1998, banking institutions, in their own self-interest, appear to be well under way in making enhancements to their credit risk management systems. With regard to the due diligence process, banks are requesting and receiving more information from their hedge fund counterparties, such as value-at-risk calculations, position concentrations, aggregate off-balance sheet positions, and the results of stress tests. Banks have also increased the rigor of the due diligence processes applied to hedge fund counterparties, including the use of their own quantitative risk management specialists to conduct on-site reviews of hedge fund risk management systems. Increasingly, hedge funds recognize that they need to provide their counterparties with more information. All parties are looking for remedies short of having funds disclose specific position information that they feel might compromise the integrity of their proprietary investment strategies. It is expected that a major contribution in this area will be made by the Counterparty Risk Management Policy Group. Banks are also moving to develop more realistic counterparty credit risk exposure measures, including various types of stress testing of their credit risk exposures to major counterparties. Some banks are reviewing their policies regarding how, when, and with what types of counterparties they will require collateralization of potential future exposures. In general, all of the banks reviewed last year have conducted their own internal assessments of lessons learned and, in their own self-interest, are reassessing their business strategies regarding hedge funds and moving forward to make necessary enhancements to their risk management processes.
Federal Reserve Board Guidance on Counterparty Credit Risk Management
Federal Reserve supervisory guidance that is particularly pertinent to issues surrounding bank relationships with hedge funds was first issued in the Federal Reserve’s Trading Activities Manual (TAM), published in 1994. This manual discusses general sound practices for managing the market, credit, legal, liquidity and operating risks involved in bank trading and derivatives activities. The manual also provides guidance in other areas such as accounting, capital requirements, financial performance measurement, ethics, and regulatory reporting and compliance. It also provides over 35 individual instrument profiles that describe the risks and supervisory issues involved in each product. Over the years, this manual has come to serve as a definitive industry resource on sound risk management practices as they relate to trading and derivative activities. Revision of the guidance in this manual is an ongoing process: the manual was substantively revised in 1998 and is updated each March and September. In 1994, the Federal Reserve also issued specific guidance focusing on hedge funds. Both this specific guidance and our manual emphasize the importance of sound financial analysis of counterparties that can quickly adjust their risk profiles. In reviewing the 1998 financial performance of large banking institutions, a number of general lessons on how, where, and why breakdowns in risk management processes can occur have been re-emphasized to both banks and bank supervisors. 
As has been the case in most instances of bank losses, competition, the pursuit of earnings, and the general press of business often result in the introduction of risk exposures for which existing risk management infrastructures may not be sufficient. Moreover, breakdowns in risk management most often arise in product, customer, and business lines that experience significant growth and above-normal initial profitability. In an effort to emphasize some of the general lessons highlighted by events over the past two years, and to advance the application of these lessons in the interests of avoiding future difficulties in other areas, the Federal Reserve issued its supervisory letter on counterparty credit risk management on February 1 of this year. The guidance is aimed at providing supervisors and bank management insights on those elements of counterparty credit risk management systems at large, complex banking organizations that may need special review and enhancement in light of the rapid changes taking place in banking and financial markets. The guidance is targeted at relationships with all types of bank counterparties, including hedge funds. It reiterates and expands upon fundamental principles of counterparty credit risk management that are covered in existing supervisory materials of the Federal Reserve and other regulators, and in established industry standards. It emphasizes areas that, while generally understood for several years, have become increasingly important given the global linkages of financial markets. In particular, the inter-relationships between market and credit risks, and their effect on the magnitude of derivative counterparty exposures, especially in times of stress, constitute an increasingly important area that merits the attention of all banks engaged in derivative activities. Accordingly, this issue is discussed at length in our recent guidance. From a broad perspective, the guidance advises banking institutions to focus sufficient resources on ensuring the adequacy of all elements of their counterparty credit risk management systems, especially for activities, business lines and products experiencing significant growth, above-normal profitability or risk profiles, and large potential future exposures. Recognizing that strong internal controls and internal audit functions are the first line of defense in avoiding problems, the guidance also advises institutions to ensure that internal audit and independent risk management functions focus on growth, profitability and risk criteria in targeting their reviews. Institutions are also advised to calibrate their credit risk management policies and procedures to the risk profiles of specific types of counterparties and instruments. Too often, general policies and procedures developed to cover all types of counterparty exposures can leave important gaps in the assessment of risks posed by specific types of counterparties. The guidance specifically addresses four basic elements of counterparty credit risk management systems: the assessment of counterparty creditworthiness; credit risk exposure measurement; the use of credit enhancements and contractual covenants; and credit risk exposure limit-setting and monitoring systems. 
With regard to the assessment of counterparty creditworthiness, the guidance points out the need for policies and procedures that are tailored to the risk profiles of counterparties and for internal controls that ensure actual practices conform with these policies. In complying with this guidance in the context of their hedge fund relationships, banks are expected to have specific policies for assessing the unique risk profiles of hedge funds, including the scope of due diligence analysis and ongoing monitoring to be conducted, the type of information required from hedge fund counterparties, and the nature of stress testing used in assessing credit exposures to hedge funds. As mentioned above, the Federal Reserve has adopted the Basle Committee’s recent guidance on sound practices governing bank relationships with hedge funds and expects that banks’ internal policies regarding their hedge fund relationships will be brought into compliance with those sound practices. In the area of exposure measurement, the Federal Reserve’s guidance also points out that potential future exposure measures are becoming more important in managing the credit exposures of derivatives positions. Accordingly, institutions must ensure that potential future exposures for both secured and unsecured positions are measured realistically and are better incorporated into measurement and limit systems. The guidance also advises institutions to step up existing programs to enhance credit risk exposure measures by incorporating netting and portfolio effects. The need for better stress testing and scenario analysis of credit exposures that incorporates the interaction of credit and market risks is also identified. In essence, the guidance points to the need for a better balance between the qualitative and quantitative elements of exposure assessment and management for all types of counterparties.
Conformance of Federal Reserve Supervisory Guidance with Other Supervisors
The development of supervisory guidance on sound risk management, like the development of industry practices, is an evolutionary process enhanced by experience. It could be argued that, to a large extent, the fundamental principles of assessing counterparty credit risks, of measuring and stress testing the potential future exposures of derivative positions, and of avoiding over-reliance on collateral have been well documented in supervisory guidance for several years. However, given advances in technology and the increasing pace of financial innovation and market interdependency, the techniques and means used to implement these principles are under constant development and refinement. Accordingly, supervisors must endeavor to ensure that their guidance is as up-to-date as possible. As mentioned above, our most recent guidance both re-emphasizes and supplements existing Federal Reserve Board principles and guidelines. Although different supervisors start from different bases of existing guidance, we believe the current body of Federal Reserve guidance on the risk management of trading, derivatives, and other capital markets activities is entirely consistent with that issued over the years by the Basle Committee on Banking Supervision and by other U.S. bank regulators. 
The recent guidance released by the Basle Committee, “Sound Practices for Banks’ Interactions with Highly Leveraged Institutions”, covers the same material and provides the same direction to supervised institutions for a specific type of counterparty as that addressing all types of counterparties contained in existing Federal Reserve guidance. Moreover, as was mentioned above, the specific Basle guidance has been fully incorporated in the soon-to-be-released updates to our Trading and Capital Markets Activities Manual. In addition, existing Federal Reserve guidance is also consistent with that issued by the OCC. In its most recent supplemental guidance to Banking Circular 277 and the Comptroller’s Handbook for National Bank Examiners, the Comptroller identifies 13 lessons learned from events over the past two years. Although Federal Reserve guidance on trading and derivative activities may use different formats, it conveys to supervised institutions the same direction and sound practices embodied in each of these 13 lessons. For example, the Comptroller’s recent guidance discusses the need for senior management and the board of directors to understand the limits of their price risk measurement systems and goes on to emphasize the need for stress testing such exposures. Supervisory guidance of the Federal Reserve has long advised of the importance of stress testing market risks and of conveying these reports to senior management and the board of directors so that they can fully understand the institution’s risk exposure and adjust risk tolerances accordingly. Our most recent guidance on the measurement of potential future exposures and stress testing supplements this prior guidance. Perhaps the most important guidance emphasized by both the Federal Reserve and the OCC is that which advises banks and examiners to ensure that sufficient risk management is targeted at new, growing, and highly profitable activities. As mentioned above, such areas have been the source of most bank losses. In summary, the Federal Reserve believes that its existing supervisory guidance on trading and derivatives activities at state member banks and bank holding companies is entirely consistent with, and complementary to, that of the Basle Committee on Banking Supervision and the OCC. Together, this supervisory guidance offers a clear set of sound practices that, when implemented appropriately, serves to enhance and support market discipline by strengthening the risk management processes of major creditors and counterparties.
Supervisory Lessons Learned
Events in developing and developed financial markets and the various types of losses posted by banking institutions over the past two years, including recent events surrounding bank hedge fund relationships, have also provided supervisors and examiners with important lessons. From one perspective, we would like to think that effective supervision contributed to the ability of U.S. institutions to weather the financial storms of the past two years. Our reviews indicated, and the financial results illustrate, that while the LTCM incident and other episodes over the past two years may have significantly affected earnings, they did not threaten the solvency of any U.S. commercial banking institution. Still, our review of our own performance suggests room for enhancements on our part. 
Within the context of the Federal Reserve’s risk-focused approach to supervision, major counterparty exposures are generally reviewed during both regular and targeted reviews of banks’ derivatives and counterparty credit risk systems. Our internal reviews found several cases where examiners, like banking institutions, may have placed too much emphasis on the full collateralization of current exposures. In the past, examiners have generally focused supervisory resources on assessing the risks entailed in unsecured credit exposures. Moving forward, our guidance instructs examiners to incorporate measures of potential future exposure in stratifying samples and selecting counterparties and transactions upon which to base targeted testing of practices and internal controls, regardless of the collateralization of current market value exposures. Examiners are also instructed to review the results and adequacy of an institution’s stress testing and scenario analyses in assessing both the magnitude and the management of credit exposure. The need to emphasize in-depth transaction testing is another important supervisory lesson learned (or relearned) in the LTCM case, and this is emphasized in our supervisory guidance. The increasing complexity of financial markets and banking activities places a premium on focusing supervisory resources on high-risk areas and conducting sufficient transaction testing to identify variances between policy and practice. Increasingly this involves conducting transaction testing with highly qualified specialists. Targeting resources at retaining, recruiting and developing such specialists, as well as providing them with automated tools to enhance their efficiency and effectiveness, is a top supervisory priority at the Federal Reserve.
Closing
In closing, I would like to emphasize the significant amount of attention that the LTCM incident, in particular, and bank relationships with hedge funds, in general, have received, and continue to receive, in both public and private venues. Although market discipline may not have worked to prevent the LTCM event in the first place, the marketplace has reacted appropriately and we have learned much to carry us forward. Banks and securities firms, in their own self-interest, have tightened their risk management processes as they relate to hedge funds. Hedge funds now face a new reality of tougher counterparty oversight. Supervisors are also enhancing their oversight of banks’ hedge fund exposures. The supervisory guidance issued by the Basle Committee, the OCC, and the Federal Reserve represents an effective, quick, and needed response to an important issue. This guidance effectively reinforces private sector initiatives to enhance counterparty credit risk management processes. As I mentioned at the outset, even more work needs to be done to ensure that the lessons we have learned over the past two years become ingrained in standard practice and to ensure that effective market discipline is brought to bear on the risk-taking of hedge funds and other entities that make use of significant financial leverage. In particular, we look forward to the reports and recommendations of the Counterparty Risk Management Policy Group, which will provide additional practical tools for implementing both industry and supervisory sound practices in counterparty credit risk management.
board of governors of the federal reserve system | 1999 | 4
Remarks by Mr Roger W Ferguson Jr, a member of the Board of Governors of the US Federal Reserve System, before the National Automated Clearing House Association in Atlanta, Georgia on April 13, 1999.
Mr Ferguson looks at the challenges to the US payments system posed by the Year 2000 problem
Remarks by Mr Roger W Ferguson Jr, a member of the Board of Governors of the US Federal Reserve System, before the National Automated Clearing House Association in Atlanta, Georgia on April 13, 1999.
The Payments System and Year 2000: Current Preparations and Challenges
Thank you for your invitation to speak today on the timely and critical topic of the Year 2000 challenge to the payments system. As an industry, we have a tremendous responsibility to the public to ensure the smooth operation of the payments system through the Year 2000 and beyond. This morning I would like to discuss the global challenge presented by the Year 2000 problem and describe international efforts, particularly those of the Joint Year 2000 Council, to help the worldwide financial services industry be ready for the century date change. In particular, I would like to focus on contingency and risk management initiatives undertaken by central banks and on the state of readiness of Federal Reserve payment systems, such as Fed ACH. Finally, I will discuss our responsibility to the general public and our need to continue to develop cooperative efforts between public-sector and private-sector organizations.
Global Challenge
With less than a year left before the millennium change, it remains difficult to judge its global impact. The good news is that the problem is not being met with inaction, and there are early indications that the financial industry, in general, is working diligently to be prepared. The euro conversion, which I will elaborate on later, demonstrates that the financial industry can manage significant changes to its systems. It is also important to note, however, that the Year 2000 deadline is not a single event; it is arriving in stages based on dates typically used in financial systems that give us an “early warning” of potential problems. As an example, for any financial systems that look ahead one year, the early January 1999 dates provided a test of their Year 2000 readiness. During 1999, other application look-ahead dates will provide further indications of our readiness as well as of our ability to initiate our contingency plans. Given the sheer number of computers and chip-based systems, and the manual nature of the fixes that have emerged to date, we are likely to experience some degree of disruption. I believe most disruptions will prove to be mild and short-lived; as a recent article put it, they will amount to “headaches, not disaster.” Consequently, I do not think that we will face a global recession. Ultimately, I believe the number and extent of computer system disruptions will not be the measure of our success in addressing the Year 2000 problem. Instead, our success will be measured by our ability, and the public’s confidence in our ability, to conduct business operations effectively for the balance of this year, on January 3, 2000, and thereafter.
Current State of International Year 2000 Efforts
Some foreign governments and institutions started their Year 2000 programs years ago, while others were initially reluctant to recognize that a problem even existed. This reluctance hindered efforts to take early action. In some areas, already limited resources were dedicated to other activities, such as the euro conversion or attempts to restructure weakened economies. 
Those countries that recognized the problem early and had sufficient resources assembled technical experts from whatever source was available. Recognizing these challenges, in April 1998 the Bank for International Settlements hosted a Year 2000 roundtable to raise international awareness. At the close of the conference, the sponsoring organizations (the Basle Committee on Banking Supervision, the G-10 Committee on Payment and Settlement Systems, the International Association of Insurance Supervisors, and the International Organization of Securities Commissions) established the Joint Year 2000 Council. The Council was formed as a direct result of the recognition of the complexity of the global financial services industry and the need to communicate proactively with regulatory and supervisory authorities. As Chairman of the Joint Year 2000 Council, I work with other members of the Council to ensure that a high level of attention is given to the challenge within the global financial supervisory community and to serve as a point of contact with national and international private sector initiatives. To that end, the Council operates with the participation of central banks, insurance and securities regulators, and banking supervisors. It is the first international body that brings together such a range of financial regulators. It is important to note that the Joint Year 2000 Council is not an “international Year 2000 enforcement agency.” We do not have on-site examiners, nor are we meant to replace the efforts of national regulators. The Council established an External Consultative Committee, or ECC, to enhance the degree of information sharing between the public and private sectors. This committee includes representatives of internationally oriented organizations, including the Global 2000 Coordinating Group, the International Monetary Fund, the World Bank, and financial service providers such as S.W.I.F.T., Euroclear, and Cedel. Our most important role may be to provide a forum to facilitate information sharing and cooperation among supervisors. To accomplish this, we have developed a global supervisory contact list of over 600 financial regulators and have initiated several mechanisms for communicating with them. Our most visible effort may be a series of monthly bulletins on different themes and topics. Additionally, the Council has supported, co-sponsored, and provided assistance in planning conferences and roundtables on the Year 2000 challenge, and will continue to do so. To date, we have conducted meetings for regulators from Europe, Asia-Pacific, North and South America and the Caribbean, Africa, and the Middle East, which have provided an excellent means of bringing supervisors together to discuss common interests within specific geographic areas. What is the state of “international readiness”? As you can understand, it is difficult to measure accurately the level of international Year 2000 readiness, and certainly no one can predict with confidence exactly how the century date change will unfold internationally. There are three reasons for this difficulty. First, there is no single indicator that can be used to judge the overall Year 2000 readiness of any country, including the United States. Second, the state of readiness is a moving target. Judgments are usually based on anecdotal information obtained either first-hand through interviews or surveys or second-hand through other sources. The best information is subject to change. 
Finally, while it may be possible to reach conclusions about the Year 2000 readiness of individual industry sectors, it remains difficult to assess the interdependency of critical systems across industry sectors. In addition, the state of Year 2000 readiness of a country’s public sector may not be an accurate indicator of the status of its private sector. With that said, I believe that, as in the United States, in most countries the financial sector was probably somewhat ahead of other sectors in recognizing the Year 2000 problem, and is probably somewhat more advanced in testing and business continuity planning. The banking agencies here in the United States have recently finished Phase II evaluations. Of the banks that the Federal Reserve supervises, about 95 percent are making satisfactory progress. Notably, the same 95 percent figure applies to foreign banking organizations operating here, which as a group had been behind their U.S. counterparts. I can also report that, through the regional meetings of regulators that I referred to earlier, we have now had contact with regulators from over 100 countries, and the degree of awareness among regulators is, I find, uniformly high. Similarly, with respect to regulators, I should mention the recent release of a statement by the Basle Committee on Banking Supervision discussing the results of its second supervisory survey on the Year 2000 issue. The statement highlights the need for continued efforts, especially in several smaller jurisdictions. It is noteworthy, however, that the Committee’s survey of more than 100 regulators identified significant progress within the global financial community. A large majority of regulators indicated that they have issued specific Y2K-related guidance to their banks and believe their banks are taking appropriate steps to address the issue. I believe that these results illustrate the benefits of global supervisory cooperation in raising the financial industry’s awareness of the issue and in spurring action. Overall, no one can know with certainty the global impact of the century date change. While much good work is being done, much remains. Banking institutions, both domestic and foreign, must not allow complacency to set in.
Lessons Learned
As I noted earlier, there were concerns last year that scarce programming resources in Europe were primarily focused on the euro conversion and that Year 2000 readiness efforts lagged. The euro conversion is now complete, and its success can best be measured by the media’s and the public’s perception of it as a non-event. While the scale of system changes involved in the euro conversion was significantly smaller than that of the Year 2000 effort, the characteristics of the problem are similar, and the Global 2000 Coordinating Group conducted a workshop on the euro conversion to identify “lessons learned” that could improve Year 2000 preparedness. The results are valuable, and I would like to use them as a framework for describing payment system readiness efforts under way at the Federal Reserve. The first lesson was the need for suppliers of liquidity and credit to be ready to offset payment and funding dislocations. Central banks were urged to accept responsibility for these issues and to be able to make decisions with “less than perfect information.” The Federal Reserve has taken steps to increase its currency order and to be prepared to provide any necessary liquidity over the year-end through open market operations and discount window lending. 
We are reminding depository institutions that the Federal Reserve is available to meet liquidity needs under appropriate circumstances, and we are encouraging those that include the Federal Reserve, as a lender of last resort, in their Year 2000 liquidity contingency plans to complete the necessary discount window documentation and collateral arrangements as soon as possible. We are also meeting with agencies such as the Federal Housing Finance Board and the National Credit Union Administration to discuss liquidity issues that might affect the institutions they supervise. Another lesson of the euro conversion was the need to provide for timely and accurate information sharing among industry participants and with regulators. Effective information sharing structures, such as communication centers, must be set up that enable fast and reliable communications between market participants; among commensurate organizational levels, from senior management through operations; and with regulators. Although the Federal Reserve has demonstrated its ability to manage crisis situations, we are evaluating and augmenting our existing communication structures where appropriate. We are developing local and national event management centers to maintain a consistent flow of information within the Federal Reserve, with our business partners, and with the public at large. Regarding this last point, we are in the process of improving our information sharing with our most critical supplier, the telecommunications industry, through our participation in the Telecommunications Sector Group of the President’s Council on Year 2000 Conversion and links with the Federal Communications Commission (FCC). The Joint Year 2000 Council is undertaking initiatives that will improve cross-border information sharing between countries, particularly plans for business resumption and event management. The third lesson is the need to expand contingency planning beyond operations, to address business continuity, to coordinate planning with third-party providers, and to test these plans. We established a Century Date Change Council, comprised of senior management officials, to direct the Federal Reserve’s approach to business continuity and contingency planning for the Year 2000. The Council considers risk mitigation actions to lessen the likelihood of a problem occurring and to reduce the impact of any problem that does occur, and it oversees the development of contingency response procedures in the event problems do occur. The Federal Reserve’s planning for Year 2000 business continuity is well under way. Each business completed an assessment of the adequacy of its existing contingency scenarios for Year 2000 problems last June. In addition, last November each Federal Reserve entity servicing customers documented its plans for responding to possible customer problems. Although the responsibility for having sound contingency plans lies with the management of the depository institutions, we recognize that Federal Reserve preparations to deal with potential customer problems are an important component of our business resumption planning. Our emphasis now is on continuing to monitor, and where possible reduce, risk and on refining the Federal Reserve’s plans through testing, independent reviews, and other means of coordination. Lastly, and not surprisingly, many problems that did occur with the euro conversion could have been identified with more thorough testing. 
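Before turning to those testing efforts, it may help to make concrete the date logic that future-dated and leap-year test scripts are designed to probe. The sketch below is a hypothetical illustration, not code from any actual payment application: it shows the classic two-digit-year comparison fault that a one-year look-ahead trips over as early as January 1999, a simple “windowing” remediation (the pivot value of 50 is an arbitrary choice for exposition), and the century leap-year rule under which 2000, unlike 1900, is a leap year.

# Illustrative only: the two date-logic faults that Year 2000 test scripts
# (future-dated transactions, leap-year dates) are designed to surface.

def settles_later_broken(two_digit_year_a: int, two_digit_year_b: int) -> bool:
    """Buggy comparison on two-digit years: a payment maturing in '00' (2000)
    sorts BEFORE one maturing in '99' (1999). A system that looks ahead one
    year hits this in January 1999, when 99 + 1 wraps around to 00."""
    return two_digit_year_a > two_digit_year_b

def expand_year(two_digit_year: int, pivot: int = 50) -> int:
    """A common remediation, 'windowing': years below the pivot are read as
    20xx, the rest as 19xx. Cheap to retrofit, but it only defers the problem."""
    return 2000 + two_digit_year if two_digit_year < pivot else 1900 + two_digit_year

def is_leap_year(year: int) -> bool:
    """Full Gregorian rule. Code that stops at the 'divisible by 100' test
    wrongly treats 2000 as a common year; 2000 is divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# A minimal future-dated test of the sort exchanged between operators:
assert settles_later_broken(0, 99) is False     # the fault: 2000 misordered against 1999
assert expand_year(0) > expand_year(99)         # the windowed fix orders them correctly
assert is_leap_year(2000) and not is_leap_year(1900)
print("rollover and leap-year checks pass")

Files dated to the rollover weekend and to February 29, 2000 exercise exactly these branches of date-handling code, which is why the testing efforts described below concentrate on those dates.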
Since mid-year 1998, the Federal Reserve has offered customers the opportunity to test future-dated transactions for payment applications, such as Fed ACH, and other services with electronic data exchanges. We are continuing to offer testing opportunities both during the week and on weekends through early 2000, and, in the second half of 1999, opportunities will be provided to revalidate application readiness and to test contingency procedures. As of the beginning of April 1999, over 6,200 institutions had tested with Fed ACH, and approximately 94 percent of our larger-volume ACH customers had tested. I am also encouraged by the testing efforts of private-sector ACH operators. In February, all ACH operators exchanged files future-dated to the rollover weekend, and a second test is scheduled for May 10. The Federal Reserve and NYACH will test using leap year dates this weekend. We strongly support efforts by major market participants, such as the New York Clearing House, the Securities Industry Association, and S.W.I.F.T., to test critical data exchanges both domestically and internationally. Under the leadership of the New York Clearing House and with the support of the Federal Reserve, cross-border testing of major payment transactions and S.W.I.F.T. is planned for June 1999. For all countries participating in this test, June 12 will conform to January 3, 2000, and June 13 will conform to January 4, 2000, the first business day on which all major international money markets are open after the century rollover.

Public Outreach

The Joint Year 2000 Council strongly recommends enhanced information sharing as critical to readiness efforts and to avoiding unnecessary uncertainty in financial markets. In many countries, there is currently a lack of adequate information on Year 2000 readiness. This is of concern because it impedes efficient preparations by market participants and may exacerbate negative perceptions in the marketplace. Financial institutions should share information in order to strengthen confidence that the Year 2000 challenges are being met in all financial sectors worldwide. In February 1999, the FFIEC issued guidance to financial institutions on how to address customer expectations. It emphasizes that maintaining customer confidence in the financial services industry now and after the Year 2000 needs to be a top priority of each institution and its senior management. In 1999, our outreach program will focus on business resumption activities, including Year 2000 risk mitigation and contingency response actions. The first outreach program occurred in December 1998, when officials from the Federal Reserve presented, via videoconference, an interactive session with our depository institution customers on the Federal Reserve’s contingency planning for the Year 2000. Banks are urging their regulators and other government officials to promote public confidence by speaking clearly and forcefully about Year 2000 preparedness. Regulators can play a constructive role in making sure that Year 2000 information disseminated to the public is factually accurate, balanced, and broadly distributed. Communications are a shared responsibility. Research tells us that customers want to hear directly from their bank and that the public is interested in specifics, not generalities, about our preparations. The Federal Reserve is expanding its program of outreach, with every Reserve Bank working on an aggressive program of local communications. I urge you to do the same.
The public needs to know that the Year 2000 is like other challenges we have met: through careful preparation, risk reduction, and contingency planning, we can work around any problems that might occur.

Public Obligations and Patterns of Behavior

As we get closer to the century date change, there will no doubt be more sensational coverage in the media. It is our obligation to be smart consumers of information and to listen to responsible, not alarmist, voices. Remember, as with anything that has a degree of uncertainty, there will always be those who predict the most dire outcomes. They have generally been wrong in the past, and I expect that they will be wrong again. There are likely to be some disruptions from the century date change; nothing this large and complex can be entirely fault-free. We should remember, however, that there have been serious disruptions to service in daily life before, from storms, temporary power outages, disruptions of telephone service, and the like. In general, these prove to be annoying and inconvenient, but nothing more. We also need to maintain reasonable and responsible patterns of behavior. It is possible that market participants will reduce clearing and settlement activity on December 31 and January 3. Several participants, however, have indicated plans to reduce activity over a substantially longer period, up to a month. While such a reduction may appear prudent to an individual firm, such action by multiple participants raises the possibility of thin markets with accompanying erratic price behavior. Our perspective has been to encourage market participants, especially those who hold themselves out as intermediaries, to consider carefully the overall implications, including for market liquidity, of reductions in trading volume so that the risks and benefits of such initiatives are appropriately balanced. I understand that some organizations that normally make payroll or other payments using the ACH are considering whether they should make these payments by cheque in the days immediately following the rollover to the Year 2000 as part of their contingency planning. I hope these organizations carefully consider all of the implications of such a step before making a final decision. We have a high degree of confidence in the ability of the Federal Reserve and the banking industry to continue to process electronic payments during this period. The drawbacks to payment recipients of this departure from business-as-usual practices may well outweigh any perceived benefits.

Conclusion

I believe that the financial services industry has made great progress in addressing Year 2000 issues. During the next year you will not only need to do your best to continue to repair and test your own systems, but will also need to evaluate the risk of potential failures and the effect of these failures on your business. Further, you will need to test fallback procedures or work-around processes that mitigate the effect of such failures on your ability to continue to conduct business. Finally, we must continue to be a reliable source of accurate and sound information to maintain the public’s trust. I do not doubt that we can collectively rise to these challenges.
|
board of governors of the federal reserve system
| 1999 | 4 |
Remarks by Mr Laurence H Meyer, a member of the Board of Governors of the US Federal Reserve System, at the European Institute's Conference 'Challenges to the European Millennium', in Washington, DC on 26 April 1999.
|
Mr Meyer discusses the euro in the international financial system Remarks by Mr Laurence H Meyer, a member of the Board of Governors of the US Federal Reserve System, at the European Institute’s Conference “Challenges to the European Millennium”, in Washington, DC on 26 April 1999. I am very pleased to be here today with such a distinguished group of European officials to speak about the monetary adventure that commenced in Europe earlier this year. These are challenging days for policymakers, with or without a new currency to worry about. Our task today is to celebrate and speculate. We celebrate an important international event, the creation of the European Economic and Monetary Union, and with it the birth of a new currency, the euro, and a new supra-national central bank, the ECB. We recognize the leadership this spring of Germany as President of the European Union. And we speculate about the implications of monetary union and the euro — for the euro area, for the United States, for the world economy, and especially for the international monetary system. In my remarks today, I would like to touch on four issues. First, the role of an international currency and the likelihood that the euro could displace the dollar to some extent over the years ahead; second, the pros and cons associated with having a preeminent international currency; third, the role that the exchange rate plays in monetary policy formulation; and, finally, the implications of the euro for international monetary cooperation. Today marks the 81st trading day of the euro since its debut on January 4. Since its introduction, Europe’s single currency has depreciated against the U.S. dollar by nearly 9 percent. The consensus seems to be that the euro’s depreciation has resulted largely from a slowing in growth relative to expectations in the euro zone, while, at the same time, growth in the U.S. has been stronger than expected. If this is the case, and I believe it is, then the movement of the euro has simply mirrored what would have occurred in the currencies of the euro area in the absence of monetary union. It also stands in sharp contrast to the speculation, in the years leading up to the euro’s debut, that there might be a massive and rapid reallocation of investor portfolios out of dollars and into Europe’s new currency. Some experts estimated that such a reallocation of official and private portfolios could total in the range of $500 billion to $1 trillion. Such a massive reallocation would, of course, have resulted in an appreciating euro and depreciating dollar in the early stages of monetary union. We now know that, at least in this initial period, rapid portfolio reallocation does not seem to have occurred and the euro’s exchange value relative to the dollar has moved exactly contrary to what some had predicted. The macroeconomic factors — the outlook for growth and interest rates — so far have dominated the new currency’s movements, just as macroeconomic factors were the driving force behind movements in the euro’s chief predecessor currency, the German mark. But the absence of dramatic near-term consequences should not lead us to conclude that this bold adventure will have negligible consequences for the euro area, for the global economy or for the United States. Rather it should remind us that the consequences of the euro for international financial relationships are likely to evolve over some considerable period of time. 
The consequences might include a more competitive and healthy euro-wide economy, a greater role for the euro as an international currency, a growing role for euro-denominated assets in global portfolios more generally, and changes in the global pattern of exchange rates and current account balances. For example, if economic policies in Europe are sound, the dollar’s international status would likely diminish somewhat and the euro’s international status would increase, reflecting the prosperity of the euro-area economy and its role in the global marketplace. The international status of a currency involves a number of different functions that correspond generally to the classic uses of money as a unit of account, a means of payment, and a store of value. Since the end of the Second World War, the dollar has been preeminent in the global financial system in terms of the currency in which contracts are denominated and paid, official reserves are held, and financial instruments are issued. But, just as the German mark increased in international importance and slightly eroded the dollar’s status, so is the euro likely to continue this trend. Inertia and so-called network externalities are key to explaining why a shift away from the dollar would not occur overnight. Inertia probably explains why, until relatively recently, a number of traded commodities were quoted and priced in sterling. If the dollar is used in enough different ways in the international arena, then it is cheaper to use the dollar rather than some alternative — I will use the dollar simply because you and everyone I do business with uses the dollar — that is, there is a network externality associated with using the dollar. But the international financial consequences of the euro will be driven most importantly by the extent to which euro-denominated assets play a growing role in global portfolios, displacing to some degree the role of dollar-denominated assets. Such an expanded role for euro-denominated assets would depend upon and follow from the development of a broad, liquid euro securities market to rival the dollar securities markets. But I am talking only of potential. A growing and ultimately significant international role for the euro is not automatic. Instead, the euro area will have to earn its place in the international financial system, and earn it the old-fashioned way — by pursuing policies that produce a healthy euro-area economy. By improving price discovery within the euro zone, the euro itself may increase competitiveness across Europe, and thereby benefit low-cost producers, trade competitiveness, and European consumers. But the key to the success of the euro area would still appear to be the same set of policies that would have been essential in France, Germany, and other euro countries in the absence of the euro: structural reforms, especially those related to reducing the rigidities in European labor markets, and disciplined monetary and fiscal policies. Does the euro make structural reforms and disciplined policy more or less likely? Frankly, I do not know the answer to that question. Fortunately, we do not live in a zero-sum world. The structural reforms and monetary and fiscal policies that would make the euro area stronger and more competitive would certainly enhance the strength of the world economy as well. Whenever the world economy is strengthened, the U.S. benefits.
However, this should not lead us to understate or neglect the significant changes in trade flows, current account balances, and exchange rates that might be part of the transition to an increased international role for the euro, and the importance of careful management of this transition. It is only fair, however, to warn my euro friends that having a preeminent international currency has its pluses and minuses — it is not an unmitigated free ride, as some of the discussion has suggested. There is a burden that accompanies having an international currency. Policy changes that are undertaken for domestic purposes can have important effects on others, effects that are magnified when a country or region is not only a large part of the world, in terms of goods production and trade, but also a large part of the global financial system and a foundation of the international monetary system. For example, Federal Reserve policy in the early years of Paul Volcker’s chairmanship involved a large increase in short-term interest rates designed to rein in high domestic inflation. For countries in Latin America that had borrowed heavily in dollars at floating interest rates, the change in U.S. monetary policy had enormous consequences. Thus, when a country’s currency is widely used internationally, the international spillovers of domestic policy actions can be intensified, so that domestic policy actions quickly seem to become everyone’s business and can generate criticism in international circles. I might note that we at the Federal Reserve steadfastly resist the call to be the world’s central bank. We are the central bank of the U.S., and Congress has given us a quite specific and narrow mandate: promote price stability and full employment at home, period. While their mandate is not identical to that of the Federal Reserve, the ECB will have to wrestle with the same issues, and increasingly so as the euro becomes a major player in the international monetary system. On the other hand, having an international currency can provide substantial benefits. The most direct benefit is seignorage revenue. With about $300 billion of U.S. currency notes in the hands of foreigners, the United States earns roughly $15 billion per year (less than 0.2 percent of GDP) in seignorage. With the euro as another major currency in the world’s economy — and, in light of the potential attractiveness of euro notes, when they are introduced in 2002, to countries in eastern Europe and former Soviet states — we could stand to lose a considerable fraction of the interest-free loan that the rest of the world provides to us. This will, obviously, not be a major event for us, and I would hope it is not the major benefit to the euro countries of the birth of the euro! Some believe that the major benefit from the United States having the premier international currency and the broadest and most liquid financial markets is that we are able to finance our large current account deficit more cheaply than would otherwise be the case. That is true only to the extent that the international status of the dollar brings lower interest rates on dollar liabilities. This suggests that international currency status might affect a country’s (or, in the case of the euro, the region’s) equilibrium real exchange rate and real interest rate.
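To put rough numbers on the seignorage arithmetic just described, here is a back-of-the-envelope sketch; the roughly 5 percent interest rate and the roughly $9 trillion level of nominal GDP are my own assumptions, chosen only to be consistent with the figures quoted above:

\[
\text{seignorage} \approx i \times B^{f} \approx 0.05 \times \$300 \text{ billion} \approx \$15 \text{ billion per year},
\qquad
\frac{\$15 \text{ billion}}{\$9{,}000 \text{ billion}} \approx 0.17\% \text{ of GDP}.
\]

Here \(B^{f}\) is the stock of U.S. currency held abroad. Because currency pays no interest, the government in effect borrows that amount at a zero rate, and the saving relative to market rates is the seignorage flow.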
But looking at exactly the same phenomenon in another way, those who seek international currency status should not forget that an appreciating currency — and one that is very often a safe haven in times of economic and political crisis — implies a reduction in competitiveness and a drag on economic growth through a declining current account balance. As the international role of the euro increases, the demand for euros will increase — both directly as currency and indirectly via euro-denominated assets — potentially leading to an appreciation of the euro. I say potentially because we all know that the price of anything, including the euro, is affected by supply as well as demand. And the euro will encourage an increase both in the issuance of euro-denominated liabilities and in the demand for euro-denominated assets. As a result, there is some uncertainty about the effect of portfolio decisions on the price of the euro. Even if the euro appreciates, this consideration suggests, the magnitude of the appreciation may not be very large. But let us nevertheless explore the implications of a net increase in the demand for euros, and hence an appreciation of the euro over time, given that this is the result many have emphasized and one with particularly important implications for the international financial system. The effect of a large reallocation of portfolios away from the dollar and dollar-denominated assets and into the euro and euro-denominated assets would, by definition, imply a swing from current account surplus to deficit for the euro area as investors acquire net claims on the region. Indeed, the appreciation of the euro would be the cause of the euro area’s move from surplus to deficit. But such an appreciation of the euro would bring a reduction in competitiveness that could provide additional pressure for structural reform in continental European product and labor markets. And the counterpart to this from our side would be a reduction in net claims on the United States, which might be just the medicine the doctor ordered to help us move to a more sustainable external position. I’d like to say a few words about how monetary policymakers think about the exchange rate when judging the appropriate level for domestic short-term interest rates. As you know, the American economy is relatively closed when compared with individual European countries — U.S. exports of goods and services, for example, are only 14 percent of our GDP, compared with Germany, where exports are 31 percent of GDP. With monetary union, the euro area’s openness to trade with countries outside the single currency region becomes more similar to that of the United States. For U.S. monetary officials, the dollar — through its implications for U.S. net exports and the growth of real GDP as well as for inflation — is one of many important economic variables that are inputs into policymaking. The dollar is not, in and of itself, an explicit goal for policy. The dollar and its performance are very, very important for the United States, but the Federal Reserve sets its target for the Fed funds rate taking into account all the performance variables in the economy and does not try to target the exchange rate. The European Central Bank now confronts a similar situation relative to the euro’s value. The depreciation of the euro since its introduction should provide a boost to net exports and put some upward pressure on inflation in the euro zone.
The ECB must weigh the effects of the euro’s depreciation along with many other factors affecting the region’s economy in judging the appropriate policy stance. Some experts have alleged that the Federal Reserve — and now the ECB — treat the exchange value of their currency with benign neglect. I have some problems with this characterization. In the United States, we have had a system of flexible exchange rates since the breakdown of the Bretton Woods system in the early 1970s. By definition, a flexible exchange rate system is one in which policymakers have complete discretion over monetary policy. That is, we do not devote monetary policy to the maintenance of a particular value for the exchange rate. That is not benign neglect. That is the implication of a system in which we retain total flexibility for monetary policy actions in order to best stabilize the domestic economy, of which the international sector is an important component. The importance of maintaining the focus of monetary policy on domestic objectives has clear implications for the suggestion that international monetary cooperation be directed toward the maintenance of target zones for exchange rates. Some have suggested that now that we have three major currencies — the dollar, the euro, and the yen — it may be easier to achieve such cooperation. The importance of keeping monetary policy focused on domestic objectives, however, rules out, in my view, assigning monetary policy the objective of maintaining exchange rates within target zones. Frankly, given the volatility of international capital flows, I doubt we could accomplish such an objective even if we assigned monetary policy the responsibility of enforcing such an agreement. And, as I said, we should not even want to try to do so. What we should want is a system that permits the United States, the euro area, and Japan to pursue policies to achieve domestic objectives, without generating wide and unsustainable swings in exchange rates. But cyclical swings in exchange rates and trade deficits that naturally arise from cyclical divergences across countries should neither be avoided nor regretted. These movements are potentially stabilizing and part of the benefits of a flexible exchange rate regime. We have seen that very clearly recently. Monetary policy responses to cyclical developments tend to reinforce the cyclical swings in exchange rates, and these exchange rate movements are very much part of the stabilizing power of monetary policy. So, in general, these movements should not be avoided or regretted. However, when cyclical swings are of longer duration or deeper than normal, fiscal policy must come into play. Fiscal policy adjustments allow a country to meet its stabilization objective more by bolstering domestic demand and correspondingly less by encouraging external demand at the expense of other countries. In addition, in times of global economic stress, relying on cyclical swings in exchange rates to stabilize individual countries amounts to relying on a redistribution of demand across countries rather than on policies that raise global demand. Changes in structural fiscal deficits — bringing changes in national saving rates and in relative real interest rates around the world — are a major contributor to non-cyclical swings in exchange rates and current account balances.
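The national-income accounting identity behind this last point may be worth spelling out; it is standard bookkeeping rather than anything specific to this speech:

\[
CA \;=\; S - I \;=\; \underbrace{(S_{p} - I)}_{\text{private saving less investment}} \;+\; \underbrace{(T - G)}_{\text{government saving}} .
\]

Holding private saving \(S_{p}\) and investment \(I\) fixed, a larger structural deficit (\(G - T\)) lowers national saving and pushes the current account \(CA\) toward deficit, with real interest rates and the exchange rate doing the adjusting; this is why divergent fiscal policies can generate the large swings described next.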
Divergent fiscal policies may generate movements in exchange rates and current account balances that are large and unsustainable, subjecting the global economy not only to an inevitable reversal of these movements, but also to the threat of a disorderly correction. More dialogue among the dollar, euro, and yen countries might have produced a set of policy mixes across these countries that avoided some of the wider swings in exchange rates that we have witnessed during the past 25 years. We have learned much from this period and will all work toward putting our knowledge to good use. Let me conclude by once again congratulating the euro area on the remarkably smooth birth of its new currency and new central bank. I personally wish the euro area great success with this bold initiative.
|
board of governors of the federal reserve system
| 1999 | 4 |
Remarks by Mr Laurence H Meyer, a member of the Board of Governors of the US Federal Reserve System, at the Ninth Annual Hyman P Minsky Conference on Financial Structure at the Jerome Levy Economics Institute, Bard College, Annandale-on-Hudson, New York on 22 April 1999.
|
Mr Meyer considers the role of exchange rate policy, macroeconomic policy, banking supervision and regulation Remarks by Mr Laurence H Meyer, a member of the Board of Governors of the US Federal Reserve System, at the Ninth Annual Hyman P Minsky Conference on Financial Structure at the Jerome Levy Economics Institute, Bard College, Annandale-on-Hudson, New York on 22 April 1999.

Structure, Instability, and the World Economy: Reflections on the Economics of Hyman P Minsky

This paper has its origin in a request by Don Brash, Governor of the Reserve Bank of New Zealand, to present a central banker’s perspective on the Asian crisis to a group of Southeast Asian central bankers. So the central banker’s perspective remains an organizing theme. Central banks have two core missions: the pursuit of monetary policy to achieve broad macroeconomic objectives and the maintenance of financial stability, including the management of financial crises. The latter mission is closely connected to regulation and supervision of the banking system, so I include this within the central banker’s perspective, as well as broader issues related to systemic risk in the financial sector. Central banks also often have, or share with Finance Ministries, control over exchange rate policy, including the choice of an exchange rate regime and the management of that regime. So, today, I consider the role of exchange rate policy, macroeconomic policy, and bank supervision and regulation in the crises and suggest some lessons in each case. As I was writing the paper, it became clear that my interpretation of the sources of, and appropriate policy responses to, the crises among the Asian emerging economies drew heavily upon the work of Hy Minsky. Perhaps that should not be surprising, since Hy and I were colleagues for more than two decades at Washington University. But the truth is, in many respects, Hy and I came from different worlds. My highly traditional background in economic theory was in rather stark contrast to Hy’s self-proclaimed war on neoclassical economics. While it is true that I never lost my commitment to traditional models — not a surprise to those who still hear me talk about the critical importance of the NAIRU framework to understanding inflation dynamics — I have often found words coming out of my mouth that reflect the distinct and powerful influence that Hy has had on my thinking. The truth is, there are few who have influenced my thinking about economics more than Hy. Indeed, he had so much to offer that if I only accepted a small dose, it was still enough to be a powerful complement, and perhaps antidote, to my otherwise conventional upbringing. Hy’s analysis of the sources of financial crises — his “financial instability hypothesis” (Hyman P. Minsky, “The Financial Instability Hypothesis: An Interpretation of Keynes and an Alternative to ‘Standard’ Theory,” Nebraska Journal of Economics and Business, winter 1977) — is the foundation for my interpretation of the sources of the Asian crisis. In addition, his work on how policies and institutions in advanced capitalist economies have evolved over time to mitigate the risks and attenuate the effects of financial disturbances — as developed in “Can ‘It’ Happen Again” (Hyman P. Minsky, “Can ‘It’ Happen Again,” in Dean Carson, ed., Banking and Monetary Studies, Homewood, Illinois: Richard D. Irwin, 1963) — is central to my discussion of how to mitigate the risks of such serious financial and banking crises in the future.

I. Sources

Recessions, in general, and especially when accompanied by financial crises, are the product of a coincidence of adverse shocks on an already vulnerable economy.
External shocks that would have been shrugged off by a robust economy can lead to seemingly disproportionate declines in economic activity when they fall on an economy characterized by excessive leverage, speculative excesses in asset markets, poor risk management, and inadequate regulation and supervision in the banking sector. The adverse shocks that appeared to trigger the crises included the slowdown in export revenue due to a slump in the semiconductor market; the slump in Japan in the spring of 1997, which removed a source of demand for the region; and the appreciation of the dollar relative to the yen, which undermined international competitiveness in the region. These shocks — individually and collectively — did not seem large enough to account for the dimension of the crises; hence the importance of understanding the vulnerabilities that I believe were instrumental in transforming a series of modest shocks into disproportionate effects on these economies. Hy’s work focused particularly on the endogenous nature of evolving vulnerabilities. Indeed, he often viewed his major contribution as the explanation of the upper turning point in the business cycle. I have often described his views as suggesting that “stability is destabilizing.” That is, a period of stability induces behavioral responses that erode margins of safety, reduce liquidity, raise cash flow commitments relative to income and profits, and raise the price of risky relative to safe assets — all combining to weaken the ability of the economy to withstand even modest adverse shocks. This is, at least in my interpretation, the substance of Hy’s “financial instability hypothesis.” In the case of the Asian emerging economies, there was evidence of speculative excesses in financial and real estate markets in some of the countries. There was, in addition, an extraordinary taking-on of risk in the form of enormous leverage in the non-financial sector and in the financing of longer-term domestic investment projects with shorter-term foreign-denominated borrowing. The failure to respect risks was evident not only in financial markets and financing practices, but also in the investment decisions themselves. These risks were compounded by poor risk management and inadequate bank supervision and regulation. It should be noted, however, that not all the countries were affected by all of these vulnerabilities or to the same degree. Financial sector vulnerabilities often increase during a cyclical upswing, as Minsky emphasized so often, setting the stage for the subsequent downturn. But in the case of the Asian developing economies, there was also a systemic source of these vulnerabilities: weaknesses in corporate governance and moral hazard associated with implicit or explicit government guarantees. The result was incentives for excessive risk taking. To understand the dimension and spread of the crisis among Asian developing economies, we also have to take account of the vulnerability generated by fixed exchange rates in the presence of volatile international capital flows, the role of market psychology, and the role of contagion effects.
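Minsky’s own taxonomy of financing positions makes the “cash flow commitments” point concrete. Stated as simple inequalities (the notation is mine, not the speech’s), let \(Y_t\) denote a unit’s expected cash receipts in period \(t\), and \(INT_t\) and \(PRIN_t\) its interest and principal due:

\[
\begin{aligned}
\text{hedge finance:} \quad & Y_t \ge INT_t + PRIN_t \ \text{for all } t,\\
\text{speculative finance:} \quad & Y_t \ge INT_t \ \text{but } Y_t < INT_t + PRIN_t \ \text{for some } t \ \text{(debt must be rolled over)},\\
\text{Ponzi finance:} \quad & Y_t < INT_t \ \text{for some } t \ \text{(new borrowing is needed just to service debt)}.
\end{aligned}
\]

As an expansion proceeds and margins of safety erode, the mix of units shifts from hedge toward speculative and Ponzi positions, leaving the economy progressively less able to absorb a rise in interest rates or a shortfall in income: the “stability is destabilizing” dynamic described above.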
Financial sector weaknesses, pegged exchange rate regimes, and volatile capital flows combined to yield a highly combustible mixture that, with the spark of adverse shocks, ignited currency and debt crises, including the collapse of banking systems throughout the region. The result was both a particularly sharp economic downturn and significant obstacles to recovery, specifically the joint problem of restructuring the banking systems and resolving the excessive debt in the nonfinancial corporate sectors. The dramatic declines in currency and equity markets in this case were also affected by the sharp swing in market psychology. In part due to a lack of transparency, markets had a hard time sorting out what the fundamentals dictated in terms of exchange rates and equity prices. That made the markets very sensitive to factors that affected confidence in the policies followed by the countries. This meant that prompt and decisive policy action in advance of IMF programs was very important, and that a perception of government commitment to IMF programs, once in place, was imperative. Hy’s work helps us to bring a balanced perspective to the debate that still rages about the Asian crisis. Was it due to vulnerabilities in the Asian economies, or was it an illustration of the inherent instabilities of global capitalism? Hy, I expect, would have concluded that the answer is both. Capitalism, in its domestic or global form, brought great potential for higher living standards, but also the potential for instability, including occasional financial and banking crises. The key was to maximize the opportunity to take advantage of the benefits, while mitigating the risks. Still, it is important to appreciate the interplay between developments in the industrial countries and in the emerging market economies leading up to the crisis. The weakness in Japan certainly took its toll on the emerging Asian economies. The extraordinary inflow of capital into emerging Asian economies from the industrial countries contributed to possible overheating and set the stage for the abrupt and dramatic reversal of capital flows that was a defining feature of the crises. Contributing to the surge in capital inflows to the region were shortfalls in risk management by financial institutions in these countries, misperceptions about the riskiness of such investments, and attempts to diversify portfolios in these economies following a run-up in domestic equity prices. In “Can ‘It’ Happen Again,” Hy argued that advanced capitalist economies have found ways to mitigate the risks of financial and banking crises, or at least attenuate their adverse effects. Hy emphasized the evolution of the central bank’s role as lender of last resort and the stabilizing role of a large government as the central features of this policy and institutional evolution. I’ll take a somewhat broader view of the nature of the policy and institutional evolution of capitalist economies and, in turn, of the structural reforms that would mitigate risks of future crises in the emerging market economies. This broader view might also extend to the appropriate evolution of international financial institutions and cooperation to keep pace with the increasingly global form of capitalism. The importance of robust institutions and sound policies in mitigating the risks associated with inherent instabilities in capitalism suggests a role for policy “sequencing” in emerging market economies.
It is widely argued, for example, that capital account liberalization in emerging market economies should be preceded by improvements to the institutional infrastructure to make the economies less exposed to risks associated with the volatility of capital flows. These include both appropriate exchange rate and financial regimes. But, in fact, we seem only to pay lip service to such an optimal sequencing of policies. Some worry, perhaps with reason, that sequencing might become an excuse for not moving ahead with capital account liberalization. What we really seem to encourage is rapid liberalization, independent of the state of the banking and financial sector, hoping that financial liberalization will pressure the authorities to move more quickly with improvement in supervision and regulation. The Asian crisis is, I believe, a test of this approach. At the very least, we have to match the pace of capital account liberalization with careful consideration of exchange rate regimes and efforts to improve corporate governance and bank regulation and supervision. The sequencing perspective also suggests that the story behind the crisis in emerging Asian economies may have less to do with the inherent instabilities of global capitalism than with a mismatch between the evolution of institutions and policies and the pace of liberalization of financial markets and the capital account, the critical entry points to global capitalism. What may be in play, therefore, are the transition costs of a rapid increase in globalization, and especially transition costs associated with the entry of emerging market economies into the global economy. A third theme in my interpretation of the Asian crises is perhaps a lesser focus in Hy’s work, but he was nevertheless quite prophetic in relation to the recent crises. Hy warned that the ability of a central bank to act as a lender of last resort is limited to debts denominated in the country’s own currency (Hyman P. Minsky, “The Potential for Financial Crises,” in Tamir Agmon, Robert G. Hawkins, and Richard M. Levich, eds., The Future of the International Monetary System, Lexington, Massachusetts: Lexington Books, 1984). When countries finance their domestic projects with foreign-denominated debt, therefore, they lose the stabilizing potential of their central bank’s lender-of-last-resort power and confront a far more challenging and potentially unstable environment. In the case of the Asian crisis, the financing of domestic projects with foreign-denominated debt — either directly or through the banking system — created an important vulnerability, one that was dramatically aggravated by the sharp depreciation of the currencies in the crisis countries, and one that domestic central banks had limited power to arrest. So, what are the lessons from this framework for thinking about recessions in general and the Asian crisis in particular? It would be tempting to encourage countries to avoid adverse shocks. But of course, shocks are, by definition, unavoidable. To be sure, risks can be avoided or mitigated by limiting vulnerabilities. It is especially important not to become complacent during a period of excellent macroeconomic performance about the underlying strength of balance sheet positions, debt-income ratios, credit quality, quality of bank credit risk management, and adequacy of prudential supervision. This experience only reinforces the wisdom of the adage that “bad loans are made in good times.” Normal times may also be opportunities to transition from pegged to more flexible exchange rate regimes. But, to an important degree, there is an almost inexorable tendency for vulnerabilities to build to some degree during expansions.
Therefore, another key lesson is the importance of policies and institutions that mitigate the risks that evolving vulnerabilities will trigger serious crises. This episode emphasizes the importance of robust institutions — such as exchange rate regimes, bank regulation and supervision, and corporate governance — as well as sound policies in promoting good economic performance.

II. Exchange Rate Policy

Pre-crisis policy: the case for flexible exchange rates

Many countries have tried to run exchange rate regimes that fall somewhere between fully flexible exchange rates and “very fixed” exchange rates, meaning a well-designed currency board arrangement or even, in the extreme, dollarization. However, arrangements between the extremes are often difficult to sustain indefinitely, and when such arrangements break down, the result can be very painful. Whether or not currency boards are a viable option remains controversial. Such arrangements may increase the durability of fixed exchange rate systems, but perhaps at great expense to the real economy. Therefore, I conclude that one of the lessons from the Asian crisis is that a flexible exchange rate regime is, in general, preferable to pegged exchange rate regimes as a means of minimizing vulnerability to adverse shocks.

Exchange rate policy during currency crises

In principle, a devaluation or float of the exchange rate, by allowing the exchange rate to reach a more sustainable level, should lead to a subsequent easing of interest rates and other financial pressures. But, during the Mexican crisis of 1994-95 and the more recent crises in Asia and Russia, devaluations served to intensify downward pressures on financial markets: currency values plummeted, interest rates skyrocketed, capital outflows intensified, and economic activity dropped off sharply. The adverse consequences of devaluing or floating during speculative attacks are all the more reason for countries to exit from pegged exchange rate regimes into more flexible regimes during periods of normalcy. If a country has failed to exit from its pegged exchange rate regime during normal times and is confronted by a speculative attack, then the key question becomes whether and when to abandon the peg. The answer depends on whether or not a successful defense is possible. If the country’s position is strong enough — i.e., the financial sector is sound, output gaps are not already large, and foreign exchange reserves are large — to avoid devaluing during a financially volatile period, it probably should endeavor to do so through some combination of monetary tightening, structural reform, and foreign exchange intervention. Defending the peg in this way may entail costly increases in interest rates and declines in economic activity, but these costs might be substantially less than in the alternative case of an uncontrolled devaluation spiral. Of course, this leaves the key practical problem of identifying the probability that a peg can be defended. This is an extremely difficult proposition, even for a completely objective analyst. Not-so-objective players, such as national governments, have often been excessively optimistic about their chances of defending a peg.
And, it was also the case, in this episode, that the pegs were not strongly defended during the early stages of the crisis. The increases in interest rates were too timid, and the willingness to take other pre-emptive moves to restore investor confidence too limited. Conversely, recent experience could suggest that, in the face of a speculative attack, an exchange rate peg should be abandoned as soon as it is clearly unsustainable. The sooner the peg is abandoned in this circumstance the better, since the government is likely to have more reserves remaining, financial institutions will have incurred fewer losses from high interest rates, the maturity structure of the debt will have had less time to shorten, and expectations are less likely to have galvanized around the exchange rate. Still, the lessons from this period are not always so clear. Indonesia and Malaysia gave up their pegs within a month after the Thai baht floated, but suffered consequences comparable to Thailand’s. Another lesson from this episode is that early devaluations are not a cure-all.

III. Macroeconomic Policy

Pre-crisis macroeconomic policy

By conventional standards, the monetary and fiscal policies of the developing Asian economies prior to the crisis were largely disciplined and appropriate. In all of these countries, consumer price inflation — the prime metric for the success of monetary policy — was relatively subdued, especially by emerging market standards. By the metric of public sector deficits, fiscal policy also appears to have been disciplined prior to the crisis. Therefore, another important lesson of the Asian crisis is that sound macroeconomic policies alone do not preclude crises. This experience also suggests that sound macroeconomic policy must be complemented by sound financial practices, effective bank supervision, and effective corporate governance. I suspect, however, that Hy might have raised a serious question about this favorable assessment of pre-crisis policy. There was, as I noted earlier, some evidence of speculative excesses in financial and real estate markets in some of the countries, and, despite the relatively good inflation performance, an argument could be made that the speculative excesses were evidence of overheating and could have been remedied by macroeconomic policy. Higher interest rates, on the other hand, would have encouraged still more capital inflows and appreciation of the currencies at a time of increasing current account deficits. Fiscal restraint would, in retrospect, have been desirable but, at least on the spending side, would have had to be weighed against substantial infrastructure and other priorities. While the inflation performance was good by developing economy standards, it was consistently higher than inflation in the U.S., the country to which exchange rates were pegged. As a result, there was a tendency toward real appreciation, which contributed to the deteriorating current account deficits in several of the crisis countries.

Monetary policy during the speculative attack

While monetary policies may not have been inappropriate in the years prior to 1997, they were probably not tightened sufficiently or for long enough in the immediate pre-devaluation phase of the emerging crises in the developing Asian economies. Had monetary policy been tightened adequately in order to defend exchange rates in the first part of 1997, it is possible that the crisis might have been moderated, if not avoided.
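Before turning to post-float policy, it may help to spell out the interest-rate arithmetic of a peg defense. Uncovered interest parity, a textbook relationship that I am supplying here rather than one stated in the speech, says roughly:

\[
i \;\approx\; i^{*} \;+\; \mathbb{E}[\Delta e] \;+\; \rho ,
\]

where \(i\) is the domestic interest rate, \(i^{*}\) the foreign rate, \(\mathbb{E}[\Delta e]\) the expected rate of depreciation, and \(\rho\) a risk premium. A fully credible peg keeps \(\mathbb{E}[\Delta e]\) near zero, so domestic rates need not stray far from foreign rates. Once markets attach even a modest probability to a sizable devaluation, however, \(\mathbb{E}[\Delta e]\) jumps, and the domestic rate required to hold capital in place can rise sharply, which is one way of seeing why the “too timid” rate increases described above failed to stem the attacks.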
Monetary policy after exchange rates were floated

One of the most controversial aspects of post-float policy has been the appropriate stance of monetary policy. From a theoretical standpoint, the jury is still out on the usefulness of monetary policy tightening once the exchange rate is floated after a speculative attack. Proponents of tightening point to the usefulness of keeping rates high in order to make domestic assets attractive and to help contain inflation expectations following a nominal depreciation. Detractors argue that by weakening the financial system and corporate balance sheets, and by depressing economic activity, higher rates may further reduce country creditworthiness and thereby heighten downward pressures on the currency. Both positions have merit, and economic theory offers little guidance as to which deserves greater weight. Recent experience also fails to offer decisive guidance on the most appropriate monetary policy immediately following a float forced by a speculative attack. There is little in the Asian post-float experience to convincingly support the view that higher domestic interest rates did help to support the exchange rate. Currency values, for example, fell as much in countries that raised interest rates sharply — Thailand and Korea — as in countries where interest rates were raised by less, such as Malaysia. These trends, of course, mostly reflect the endogeneity of both the exchange rate and interest rates to swings in investor confidence. Countries where investor sentiment declined most strongly both experienced sharper falls in currency values and were required to raise interest rates higher to prevent even sharper depreciation. This suggests that, during the months following devaluation, exchange rates were driven as much by broad concerns about creditworthiness as by concerns about interest rate differentials. These considerations suggest that, once the exchange rate is floated and broader concerns about an economy’s financial position emerge, there is a limited contribution that monetary policy can make to stabilize the situation. Of course, when an exchange rate peg is abandoned, a reliable nominal anchor is lost at a time when the devaluation threatens higher inflation; it is essential that monetary policy be conducted with appropriate attention to controlling inflation. Striving to keep real ex ante interest rates positive may be a reasonable benchmark for post-devaluation monetary policy. Once the exchange rate stabilizes, inflation expectations moderate, and pressure on the capital account eases, it may be useful and appropriate to lower interest rates. The interest rate policies eventually followed by the Asian countries roughly followed this pattern. At present, in fact, nominal and real interest rates are below their pre-devaluation levels. At the same time, the increase in inflation has been very modest.

Fiscal policy during the financial crisis

In retrospect, it seems clear that the initial objectives for tightening fiscal policy set by the IMF for the affected Asian countries were inappropriate. The markets clearly recognized that fiscal profligacy was not behind the crisis and did not view fiscal austerity as a policy that was likely to resolve the crisis. Output in these countries has declined by more than anyone anticipated, and so fiscal loosening rather than fiscal tightening is required. An important source of initially inappropriate fiscal targets may have been poor forecasts.
As forecasts were adjusted, new fiscal targets had to be negotiated, because the targets themselves were set in terms of the overall rather than the structural deficit. This renegotiation took time and often appeared to put the Asian economies in the position of asking for relief from IMF conditionality — undermining investor confidence — rather than making a disciplined and appropriate response to changing conditions and more realistic forecasts. This suggests setting targets in terms of structural deficits, or at least allowing built-in fiscal stabilizers to continue to operate. However, estimates of structural deficits are only now being developed for Asian countries, and such estimates may not be straightforward enough to form the basis for IMF performance criteria. But the principle should be respected.

IV. Banking and Corporate Debt Problems

Weaknesses in the financial sector and excessive leverage in the corporate sector clearly contributed to the crisis in the emerging Asian economies. Indeed, the defining character of these crises was the intersection of currency, banking, and corporate debt crises. The weakness in the financial sector, in turn, was encouraged by the moral hazard associated with perceived wide-ranging government guarantees and by political interference in lending decisions by banks. As a result, banks had insufficient incentives to manage their credit risks, and firms had inadequate incentives to limit their leverage and make sound investments. There are two broad lessons that emerge from this episode and earlier experiences involving financial crises. First, to reduce the vulnerability of an economy to banking and financial crises, a high priority should be given to sound corporate governance, narrow and explicit government guarantees, and adequate prudential supervision of banks. Second, while it is of course desirable to encourage robust institutions to minimize the likelihood of such problems in the future, once a crisis has occurred, the first priority should be to repair the damage done by the crisis to banking and corporate balance sheets. Corporate balance sheets need to be de-levered, and banking systems need to be restructured and recapitalized in a proactive and timely manner, or insolvent banks and corporations will continue to be an enormous macroeconomic weight on the economy and a serious obstacle to recovery.

What do emerging market economies need to do?

Some financial sector safety net appears to be essential to avoid bank runs and promote systemic stability. But safety nets should be narrow and explicit, as opposed to broad and implicit. As a general principle, it is constructive to have safety nets in place that protect small depositors at depository institutions and thereby protect the functioning of the payments system from bank runs in the face of severe adverse shocks. Elsewhere, market discipline, supported by effective disclosures and sound corporate governance, should be relied upon to control risk taking. Even narrow, explicit safety nets for the banking sector create moral hazard incentives for excessive risk taking, and they must therefore be complemented with adequate prudential supervision. Such supervision not only promotes the safety and soundness of the banking system, but also limits the government’s contingent liabilities associated with the safety net. Still, there are limits to the ability of supervisors and examiners to monitor banks effectively and control their risks.
Market discipline therefore has to be enhanced to support sound corporate governance and complement bank regulation and supervision. The practices of directed lending to support government priorities and lending to well-connected firms undermined normal incentives for prudent behavior by both banks and business customers. Poor incentives on the part of both lenders and borrowers are a recipe for the insolvency of both. Therefore, improved corporate governance is an essential part of structural reform, encouraged by freeing banks from political interference in lending. It is difficult to see how the economies can get back to sustainable growth without taking the necessary steps to strengthen their banking sectors. What needs to be done includes a familiar list: restructuring loans, taking losses, recapitalization, improving corporate governance and disclosure, and enhancing supervision. However, unlike in Japan, the burden of recapitalizing the banks is likely to weigh significantly on the Asian emerging market economies, and they may lack the technical expertise to accomplish the steps necessary for successful banking system restructuring on their own. Foreign technical assistance, international official financial support, and/or foreign bank investments will be required. The debt problems of banks are closely related to the excessive leverage and weak financial conditions of the corporate sectors in these economies. So, resolving financial sector weaknesses means both restructuring and recapitalizing banks and arranging orderly workouts of the debt problems of their corporate sectors. Another clear lesson from the Asian crisis is that widespread insolvencies in the nonfinancial sector can be even more difficult to remedy than banking sector problems. The absence of adequate bankruptcy laws and procedures has in many cases meant that there were no established mechanisms for allocating the burden of excessive debt problems among the borrowers and the lenders.

What can industrial countries do?

First, we need to continue work by expert groups to develop standards. An excellent example of an effective process and excellent execution is the Core Principles for Effective Banking Supervision produced by the Basle Committee on Banking Supervision. The process that produced this set of standards sets an important standard of its own. The experts should set and, as necessary, update standards in a cooperative effort of supervisors and regulators around the world. It is important that these efforts include emerging market economies. Second, we need to improve monitoring of compliance with these standards. In particular, the IMF is incorporating into its country assessments compliance with international standards for banking and bank supervision. Third, we need to have sufficient resources dedicated to technical assistance for countries that are working to converge to best-practice standards, and incentives for countries to comply. Market discipline, encouraged by more limited safety nets and enhanced disclosure, could play an important role here. This could be reinforced by market access policy in developed countries; i.e., limiting access to domestic banking markets to banks from countries that meet international standards for bank supervision. Finally, proposals for pre-conditionality are intriguing, though fraught with practical problems and obstacles. There have recently been proposals for contingency funds for countries that meet certain conditions, perhaps including compliance with international standards.
This might be a way of enhancing incentives to comply with international standards. However, questions that have to be resolved include the following. Why would emerging market economies want to participate, if doing so singles them out as in potential need of liquidity lines? This may be similar to the reluctance of banks to borrow from the Federal Reserve discount window. Would the IMF (or whoever is implementing the lines) be willing to remove access to the liquidity facility if policies and conditions deteriorated in the country in question, threatening to precipitate a crisis in the process? Do we know enough about early warnings of crises to identify countries that meet appropriate standards and therefore deserve to qualify for such a facility? Moral hazard incentives affect foreign as well as domestic lenders. It is, therefore, important to find ways to ensure that foreign private lenders bear the consequences of the risks they take. Imposing losses on creditors will, of course, limit their willingness to extend credit to other borrowers. Doing so in the midst of a crisis is obviously problematic. Deciding how and when to involve the private sector in responding to international financial crises remains a challenge. Progress can be made at the margins. In particular, it might be worthwhile to look for ways to encourage the inclusion of collective action clauses in sovereign bond contracts to encourage greater cooperation among creditors when financial crises occur. Another promising direction is to promote the adoption of sound bankruptcy codes in emerging market countries to handle private debts more effectively. These measures can move the process in the right direction, but they are no panacea. We must continue to struggle to find ways to contain and resolve international financial crises without offering undue protection to international investors. Industrial countries, as well as the emerging market economies, have supervisory issues related to emerging-country risk exposures. Better supervision in the industrial countries would ensure a better focus by lending banks on the risks associated with lending to emerging market countries, reinforcing efforts to lessen the moral hazard associated with such lending. Industrial countries should continue to support international financial institutions so they have the resources to provide liquidity support and to assist in designing programs to mitigate the crisis and promote structural reform. Finally, when appropriate, industrial countries can adjust their macro policies to offset the restraint on their growth from spillover effects from the crisis countries and thereby ensure that they remain anchors in the world economy.
Mr Gramlich gives his views on stabilization policy strategy

Remarks by Mr Edward M Gramlich, a member of the Board of Governors of the US Federal Reserve System, before the Wharton Public Policy Forum Series in Philadelphia, Pennsylvania on 22 April 1999.∗

∗ Mr Gramlich presented an identical speech at the Jerome Levy Economics Institute, Bard College, Annandale-on-the-Hudson, New York on 23 April 1999.

Former Federal Reserve Chairman William McChesney Martin had a famous line about how to conduct monetary policy: "You have to take away the punch bowl when the party is warming up." While that may seem straightforward guidance (if not always easy politically!), it is not so simple in practice. Considering the broader question of stabilization policy, there are two sets of authorities that could take away the punch bowl, monetary policymakers or fiscal policymakers. Which should, and under what conditions? Another question involves the heat — exactly how warm should the party be? A third involves timing — should the bowl be taken away slightly in advance of when the party is expected to warm up, or once it is clear that the party really has warmed up? Economists and others have debated these questions of stabilization policy strategy for years, with many issues still unresolved. In this talk I take the opportunity to give my own views.

Fiscal and Monetary Policy in Theory

In domestic macroeconomic theory either fiscal or monetary policy can be used to stabilize output and employment around their trend levels, and hence prevent booms or recessions from getting out of hand. When this domestic model is broadened to open the economy to international trade and capital flows, this conclusion no longer holds; rather, the results depend on the flexibility of the nation's exchange rate.

Suppose first that a nation's exchange rate is flexible, which means that the central bank generally does not intervene in currency markets and allows private markets to set the value of its currency. In response to an incipient recession, expansionary fiscal policy will raise interest rates, attract international funds, drive up the value of the nation's currency, and reduce net exports. In a strict small country model, this process will continue until the fiscal expansion has no impact at all on the nation's output and is ineffective as a stabilization policy measure. In less strict models the international link greatly reduces the expansionary impact of the fiscal change and still makes it generally unproductive to use fiscal policy for stabilization purposes. By contrast, the impact of a monetary expansion is enhanced by flexible exchange rates. Monetary expansion will lower interest rates, lower the value of the nation's currency, raise net exports, and generally have a more stimulatory impact on output and employment than it would have without changing exchange rates. In this exchange regime it makes sense for a country to rely on monetary policy as its primary stabilization tool and let fiscal policy influence the nation's overall saving rate. It might seem like this assignment of responsibilities would consign fiscal policy to oblivion, but in fact that is not so. In the long run the most important policy a country has for influencing its long-run living standards is fiscal policy, operating through just this influence on national saving.

Interestingly, these conclusions are totally inverted if a country is following a fixed exchange rate policy, which means that its central bank intervenes in markets to set the value of its currency.
A great variety of exchange arrangements fall into this category — a gold standard, a currency board, dollarization, and exchange rate zones would all be considered fixed exchange rate regimes for these purposes. In these cases, central bank policy (if indeed there is a domestic central bank at all) must be dedicated to setting the exchange rate and cannot be used to stabilize output and employment. Interest rates cannot deviate from the level necessary to maintain the pre-set exchange rate. Fiscal policy becomes the pre-eminent stabilization tool by necessity. As expansionary fiscal policy threatens to raise interest rates and drive up the exchange rate, the process induces a rise in the money supply to preserve the fixed interest and exchange rates, accommodating a larger rise in output. For this reason the impact of fiscal policy on the economy is even greater than it would be in a closed economy.

Timing Issues

Since stabilization policy is dedicated to mitigating business cycles that can be of relatively short duration, timing also matters in developing a stabilization strategy. Fiscal policy contains automatic stabilizers — basically tax revenues that rise in booms and fall in recessions, hence stabilizing overall spending demands. These stabilizers cut the amplitude of cycles in output and employment, but they do not eliminate the cycles. To eliminate cycles altogether, policymakers must act in time for their policies to offset incipient booms or recessions. The average postwar recession in the United States has been slightly less than a year long, which considerably narrows the time horizon for what are known as discretionary policies.

Two types of lags can cause problems: inside lags and outside lags. Inside lags involve the time between the need for action and the action; outside lags, the time between policy action and its impact on the economy. An inevitable component of the inside lag is simply the time it takes policymakers, whether on the monetary or fiscal side, to recognize shifts in economic indicators. Policymakers can rely on forecasting models, but these can be sufficiently wide of the mark that policymakers often prefer to see real world data before they act. In most countries, first reports on economic data come out about a month after the quarter or month for which the data are reported, but these first reports are subject to considerable short-run noise, are often revised substantially, and in fact are themselves often little better than forecasts. If there were a sharp but unexpected change in the pace of activity, policymakers would probably not have a good statistical idea of this change until a month or more after the fact.

The other component of the inside lag is the time between recognition and action. For monetary policy this time period is usually relatively short. Central bank monetary policy committees generally meet at intervals of two to six weeks throughout the year. Normally these committees respond to changes in economic information at the first meeting after the fact, sometimes the second. If the change were extreme, most committees have the ability to make inter-meeting changes in policy.
Summing the two components, for monetary policy the inside lag is probably on the order of a quarter, less if the monetary authorities are prepared to act on the basis of forecasts. For fiscal policy, the time between recognition and action is long in parliamentary countries, seemingly interminable in the United States. In the U.S., if the need for action is recognized in the fall of the year, the President's budget message can reflect that. But the budget message, which comes out in early February, lays the groundwork for Congressional debate on the budget, which most of the time is not quite completed by the time the fiscal year starts in October. Hence, at best this component of the inside lag for U.S. discretionary fiscal policy is about a year, already longer than the average recession, and the lag can be substantially longer than that if the relevant action is at all controversial. In parliamentary countries it is possible to act faster, but even in these countries there can be substantial procedural bottlenecks to altering a budget that has already been submitted. Most of the time even parliamentary countries will wait for the next budget cycle to incorporate new discretionary fiscal policies.

The second lag is what is known as the outside lag — the time between the action being taken and an observable impact on the real economy. For monetary policy this lag was formerly thought to be long — on the order of nine months to a year in most developed countries. Monetary policy has traditionally operated through changes in short term interest rates, which then change long term rates with some lag, and real spending with some further lag. But there are reasons why this lag may have shortened in recent years — credit markets have now become more forward-looking, and asset values have a sizeable and relatively quick impact on spending through the wealth effect. For fiscal policy, tax changes that alter withholding probably operate within a quarter or two, though those that raise spending, such as on construction, can take many years to plan routes, purchase land, let contracts, and the like.

The upshot of all of this is that it does seem possible to use monetary policy as an effective stabilization instrument. Because of its relatively short inside lag and possibly reduced outside lag, monetary actions are likely to have some effects within a half-year of the recognition of the need for a change, perhaps even faster when authorities act on the basis of forecasts. But what is possible for monetary policy seems basically impossible for discretionary fiscal policy. When the extra year-long piece of the inside lag is added on, it is hard to escape the conclusion that fiscal policy should be used for stabilization purposes only in the deepest and longest of recessions. If an economy has flexible exchange rates, no harm is done by this conclusion, because the economics of flexible exchange rates suggest that monetary policy is best used for stabilization and fiscal policy for long-term policies in any case. If the economy operates under fixed exchange rates, the lesson here is that discretionary fiscal policy should generally only be used to deal with long run needs. The only operative stabilization force in these economies is then the automatic fiscal stabilizers — indeed, this suggests a potential problem for countries on fixed exchange rates.
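The arithmetic behind this conclusion is simple enough to set out explicitly. The sketch below merely totals the inside and outside lags discussed above and compares them with the length of an average postwar recession; the specific quarter counts are loose assumptions drawn from the ranges mentioned in this talk, not estimates.

```python
# Rough totals of the stabilization-policy lags discussed above,
# compared with the length of an average postwar U.S. recession.
# All quarter counts are illustrative assumptions.
AVG_RECESSION_QUARTERS = 4  # "slightly less than a year"

lags_quarters = {
    # policy: (inside lag, outside lag), in quarters
    "monetary": (1, 2),  # ~a quarter inside; an outside lag possibly
                         # shorter now than the traditional 9-12 months
    "fiscal": (5, 1),    # a year-plus inside in the U.S.; withholding
                         # changes then operate within a quarter or two
}

for policy, (inside, outside) in lags_quarters.items():
    total = inside + outside
    verdict = ("could act within" if total <= AVG_RECESSION_QUARTERS
               else "would likely miss")
    print(f"{policy} policy: ~{total} quarters total lag; "
          f"{verdict} an average recession")
```

On these rough numbers, only monetary policy can plausibly act within the span of a typical downturn, which is the conclusion drawn above.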
Fiscal Policy in the United States

Turning now from the general to the specific, I examine these issues in more detail for the United States. The first point to make is that for nearly three decades the U.S. has operated under flexible exchange rates. Neither the Federal Reserve nor the Treasury typically intervenes in exchange markets to try to influence exchange rates. This means both that monetary policy is free to operate on stabilization needs and that it should be able to do so effectively. Fiscal policy, on the other hand, is best devoted to longer run considerations.

The second salient point is that for two decades American fiscal policy has been hamstrung by low national saving rates. Much attention lately has been devoted to the low rate of personal saving in the United States — the rate that just recently dipped below zero. But while personal saving is one important component of national saving, it is only one component. A country with high business and/or government saving can provide plenty of resources for new capital investment, even with low personal saving. The overall national saving rate, based on a summation of saving in all sectors of the economy, is the key indicator of how much a country is providing for its future. Through an accounting identity, it can be shown that this overall national saving rate equals output less private and public consumption, perhaps an easier way to think about the concept. This overall rate of net national saving in the United States averaged 11.4 percent of net national product from 1950 to 1970, but then fell to 6.1 percent in the 1980s and to 5.2 percent in the 1990s. In the long run this drop is bound to show up in a reduced growth path for per capita output. The present, when the United States is on the verge of a dramatic rise in potential entitlement spending, seems a particularly poor time for such a sharp drop.

But national saving is a long run concept, and the fact that the U.S. rate has fallen does not inevitably make for economic problems. Indeed, Alan Greenspan has recently attributed much of the favorable recent economic performance to significant capital gains on wealth. These wealth gains are not counted in income from production but they raise consumption, implying a drop in national saving. They also can raise investment, implying an investment boom financed by the saving of foreigners. Of course the current account balance of payments deficit implied by the rise in investment and drop in saving may not be sustainable. Nor might the capital gains persist. If a nation could consistently count on large-scale capital gains and foreign capital inflows, it would not need as much saving. But capital gains come and go, and inflows may not be sustainable. To provide a strong and consistent basis for future growth, a developed nation will generally need to do its own saving.

While there are a number of policy changes that could in principle raise national saving, the tried and true method is through contractionary fiscal policy. Higher budget surpluses imply lower public and/or private consumption and permit more funds to be devoted to capital investment. Making the same point from another perspective, higher surpluses retire some of the outstanding public debt and free up more funds for investment. In this regard, the recent return of overall budget surpluses is most welcome, and it has already begun to raise national saving. The overall net national saving rate in the budget surplus year of 1998 was 7.5 percent, significantly above the decade average for the 1990s, despite the fact that the personal saving rate fell.
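The accounting identity just described can be restated compactly; this is only a restatement of the figures quoted above, with nothing new added:

```latex
S_{\text{national}} = \text{NNP} - C_{\text{private}} - C_{\text{public}},
\qquad
s = \frac{S_{\text{national}}}{\text{NNP}}
```

On this definition, s averaged 11.4 percent over 1950-70, fell to 6.1 percent in the 1980s and 5.2 percent in the 1990s, and recovered to 7.5 percent in the budget surplus year of 1998.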
Since contractionary fiscal policy is at the heart of the national saving issue, it is natural to seek out budget procedures that promote saving. One, used by many national governments and most American states, is the convention of separating accounts into a current and a capital budget. There would normally be restrictions against borrowing on current account, but not against borrowing to finance capital investment. While there have been arguments that the U.S. federal government should use a capital budget, such a change is unlikely to help promote saving in the present environment. One problem is that the federal government does remarkably little direct investment spending, so the current budget deficit or surplus would differ little from the overall deficit or surplus. And, at the present time the main policy issue is that the current budget is likely to run a surplus, in which case restrictions against current deficit spending would be irrelevant.

Countries that have come upon large petroleum resources have hit on another contractionary budget procedure. They have often devoted the resources to a special trust fund, ensuring that the rise in income does not translate into a rise in consumption. For many years the United States has had a budgetary device that operates in a similar manner, involving its Social Security trust fund. While generally considered within the federal budget, Social Security has operated as a budget within a budget — being financed so that current and future payroll tax revenues are sufficient to cover the current and future benefits scheduled under present law, looking ahead for the next 75 years. Several times in the past two decades this long run actuarial budget constraint for Social Security has been responsible for cuts in future benefits to bring the forward-looking Social Security budget into long term actuarial balance. Many have argued for strengthening this separation by removing Social Security altogether from the federal government's budget.

In his proposed budget for the year 2000 and beyond, President Clinton came up with another way to promote national saving, ironically by reducing the segmentation between Social Security and the rest of the budget. He proposed making general revenue transfers out of the anticipated general budget surplus to the Social Security trust fund. The idea is similar to the special trust fund employed by oil producing countries. While such a move may make for more responsible general budgets, one wonders what happens on the other side of the transfer. Since Social Security's long term budget constraint has been responsible for significant forward benefit cuts, it may not be as feasible, or as easy, to make these cuts in future benefits if there is now a tradition of making general revenue transfers to fill Social Security's revenue gaps. What may encourage national saving in one budget may discourage it in another.

While these budget structures are interesting and potentially worth exploring, my own view is that they are not likely to work in the United States. To me, a capital budget is unlikely to solve any problems, and the arrangements involving the Social Security trust fund are best left alone. In the end, I believe that the way to promote the desirable fiscal goal of raising national saving is simply to argue for it. Paying down the national debt reduces interest costs and adds to long term fiscal flexibility.
The implicit rise in saving also generates new funds for raising investment, adding skilled jobs and raising living standards in the long run. One would hope that these benefits could be defended in their own right.

Monetary Policy

Previously we saw that it often makes sense to use monetary policy for stabilization purposes. Precisely how is this to be done? The textbook way of thinking about monetary policy still runs in terms of quantities. The central bank operates so that some measure of the money stock or liquidity grows in some relation to the desired growth in real output. But the sharp changes in money velocity in the early 1990s have changed many economists and central bankers from being money quantity watchers to being interest rate watchers. Rather than trying to guess the optimal rate of monetary expansion, many analysts now focus directly on interest rates — should they be lower, higher, or the same?

John Taylor has worked out a simple way of thinking about this question. Under what is known as the Taylor Rule, monetary authorities first determine an equilibrium level of a target interest rate, say the real federal funds rate. The authorities can then adjust the actual funds rate relative to this equilibrium rate depending on inflation relative to its target, and unemployment relative to its target. If, for example, inflation were above its target and unemployment were close to its target, monetary authorities would raise the real funds rate above its equilibrium value.

While Taylor's Rule provides a useful framework for policymakers to think about policy, as an operating rule it too is beginning to encounter empirical difficulties. The Taylor Rule requires precise point estimates of the equilibrium real funds rate and targets for inflation and unemployment. None of these estimates is easily come by. The equilibrium real funds rate is difficult to estimate, and may change with productivity growth and national saving rates. On the inflation side, there are a number of different price indices, any of which could be used as a target. All of these indices can tell different stories about whether or not inflation is heating up, and there are a number of measurement problems with the Consumer Price Index, the most widely watched of these measures. On the unemployment side, the fear that inflation will accelerate if unemployment drops below an estimated natural rate has at this point proven groundless — inflation has not accelerated even when unemployment has remained in a zone that would previously have been thought well below the estimated natural rate. Over this recent period the Taylor Rule would have called for higher federal funds rates to raise unemployment, a policy change that may well prove to be unwarranted when the final history is written.

One ad hoc remedy would be to drop the unemployment term, the one that seems to be giving trouble, from the Taylor Rule. Less drastically, the policy weight on the inflation term might be raised and that on the unemployment term lowered. In these cases the Taylor Rule comes very close to an inflation-targeting rule, which indeed many economists have also advocated. But there is uncertainty about more than the unemployment target, and there may be an even better way to modify the Taylor Rule. Whenever there is doubt about the point estimates of the equilibrium funds rate, the inflation target, or the unemployment target, the Taylor Rule can be converted to a change rule.
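Before turning to that change form, a minimal sketch of the level form of the rule just described may be useful. The coefficient values, the 2 percent equilibrium real rate, and the inflation and unemployment targets below are illustrative assumptions only, not values endorsed in this talk:

```python
def taylor_rule_rate(inflation, unemployment, r_star=2.0,
                     pi_target=2.0, u_star=5.0, a_pi=0.5, a_u=0.5):
    """Illustrative Taylor-type prescription for the nominal funds rate.

    r_star is an assumed equilibrium real funds rate; pi_target and
    u_star are assumed targets; a_pi and a_u are assumed policy weights.
    """
    # Nominal equilibrium rate: equilibrium real rate plus current inflation.
    nominal_equilibrium = r_star + inflation
    # Lean against inflation above its target; ease when unemployment
    # runs above its target.
    return (nominal_equilibrium
            + a_pi * (inflation - pi_target)
            - a_u * (unemployment - u_star))

# Inflation above target, unemployment at target: the prescribed rate is
# 5.5 percent, i.e. a real rate of 2.5 percent, above its 2.0 percent
# equilibrium value, as the rule's logic described above suggests.
print(taylor_rule_rate(inflation=3.0, unemployment=5.0))
```

The change form discussed next would dispense with the point estimates r_star, pi_target, and u_star, adjusting the rate only when demand growth threatens to diverge from supply growth.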
If levels of inflation and unemployment seem to be at least within their target bands, if not at unknown point estimates of the target, monetary policy can just try to keep inflation and unemployment in these desirable bands. Policy would respond only when movements in the economy threaten to take inflation and/or unemployment out of their preferred bands. Solving a standard model of the macro-economy, such a policy would effectively convert monetary policy into what might be called "speed limit" form, where policy tries to ensure that aggregate demand grows at roughly the expected rate of increase of aggregate supply, an increase that can be more easily predicted. This version of the Taylor Rule goes back to the spirit of Martin's initial remark, where the monetary authority is happy with the cocktail party temperature at present but moves against anything that increases its warmth. Should demand growth threaten to outrun supply growth (the party threatening to warm up), the seeds of accelerating inflation may be planted, and monetary policy should curb the growth of demand by raising interest rates. Should demand growth threaten to fall behind supply growth, rising unemployment is probably in the works, and monetary policy should try to boost the growth in demand by lowering interest rates. As long as inflation and unemployment remain in the acceptable band, this change version of the rule simply tries to maintain a good thing, without requiring precise quantification of inflation and unemployment policy goals.

This approach has not addressed the question of supply shocks, which are dealt with in the general formulation of Taylor's Rule. But one can incorporate supply shocks into the change rule as well. If there are temporary supply shocks that do not seem likely to be incorporated into the broader inflation process, they should be ignored. If there are positive or negative shocks that do seem likely to be incorporated, the change rule may not work well, and one may have to go back to the general form of the rule.

Policy Strategies

This all amounts to what might be considered a stabilization policy strategy. Given the flexible exchange rate regime, fiscal policy should be used to influence overall national saving, which is still lower than it has been for most of the postwar period for the United States. I personally would argue for a rise in national saving, especially in view of the likely increase in future government entitlement spending. But I would not argue for any of the recently suggested changes in budget procedures that are alleged to help in this process. On the monetary side, authorities should try to stabilize the economy without anticipating help from fiscal policy. Generally this would involve following the dictates of the Taylor Rule, assuming that one can choose the equilibrium real funds rate and target values for inflation and unemployment. If not, and if both inflation and unemployment seem to be safely within their target bands, this policy could be simplified to a change form — in which interest rates are used to keep the growth in aggregate demand near the more easily predictable growth in aggregate supply.
Mr Greenspan discusses recent trends in the management of foreign exchange reserves

Speech by the Chairman of the Board of Governors of the Federal Reserve System, Alan Greenspan, at the World Bank's conference on Recent Trends in Reserves Management, Washington, D.C., on 29 April 1999.

One way to address the issue of the management of foreign exchange reserves is to start with an economic system in which no reserves are required. There are two. The first is the obvious case of a single world currency. The second is a more useful starting point: a fully functioning, fully adhered-to, floating rate world. All requirements for foreign exchange in this idealised (I should say, hypothetical) system could be met in real time in the marketplace at whatever exchange rate prevails. No foreign exchange reserves would be needed. If markets are functioning effectively, exchange rates are merely another price to which decision makers, both public and private, need respond. Risk-adjusted competitive rates of return on capital in all currencies would converge, and an optimised distribution of goods and services enhancing all nations' standards of living would evolve. Public and private market participants would require only liquid reserves denominated in domestic currency. And in the case of a central bank of a fiat currency regime, such reserves can be created without limit.

But, clearly, the real world is not perceived to work that way. Even if it did, it is apparent from our post-World War I history that national governments are disinclined to grant currency markets unlimited rein. The distributions of income that arise in unregulated markets have been presumed unacceptable by most modern societies, and they have endeavoured, through fiscal policies and regulation, to alter the outcomes. In such environments it has been the rare government that has chosen to leave its international trade and finance to what it deems the whims of the marketplace. Such attitudes very often are associated with a mercantilist view of trade that perceives trade surpluses as somehow good, deficits bad. Since in the short run, if not in the long run, trade balances are affected by exchange rates, rates that are allowed to float freely are few and far between. In a crisis, of course, monetary authorities are often overwhelmed, and lose any control of the foreign exchange value of their domestic currency.

Most nations, for good or ill, have not been indifferent to the foreign exchange value of their currency. I say most, but not all. Arguably, immediately following the dollar's float in 1973, U.S. authorities did not intervene and left it to others to adjust their currencies to ours. We did not sense a need to hold what we perceived to be weaker currencies in reserve because presumably we could always purchase them in the market, when, and if, the need arose. We held significant reserves in only that medium we judged a "harder" currency, that is, gold. It has become a general principle that monetary authorities reserve only those currencies they believe are as strong as or stronger than their own. Thus, central bank reserve balances, except in special circumstances, hold no weak currencies of which I am aware, other than standard transaction balances that are not viewed as stores of value.
We in the United States built up modest reserve balances of DM and yen only when we perceived that the foreign exchange value of the dollar was no longer something to which we could be indifferent, as when, in the late 1970s, our international trade went into chronic deficit, inflation accelerated, and international confidence in the dollar ebbed. The choice of building reserves in a demonstrably harder currency is almost by definition not without costs in real resources. The budget cost of paying higher interest rates for the domestic borrowings employed to purchase lower-yielding U.S. dollar assets, for example, is a transfer of real resources to the previous holders of the dollars. The real cost of capital, because of risk, is higher in a weaker-currency country. Countries with weaker currencies apparently hold hard currency reserves because they perceive that the insurance value of those reserves at least equals their cost in real resources. Reserves, like every other economic asset, have value but involve cost. Thus, the choice to build reserves, and in what quantities, is always a difficult cost-benefit tradeoff.

In general, the willingness to hold foreign exchange reserves ought to depend largely on the perceived benefits of intervention in the foreign exchange markets. An evaluation along these lines would appear to require a successful model of exchange rate determination, and a clear understanding of the influence of sterilised intervention. Both have proved to be a challenge for the economics profession. The two main policy tools available to monetary authorities to counter undesirable exchange rate movements are sterilised intervention operations in foreign exchange markets and monetary policy operations in domestic money markets. Empirical research into the effectiveness of sterilised intervention in industrial country currencies has found that such operations have at best only small and temporary effects on exchange rates. One explanation for the limited measurable effectiveness of sterilised intervention is that the scale of typical operations has been insufficient to counter the enormous pressures that can be marshalled by market forces. In one sense, this is true by definition. Another is that the assets bought and sold in intervention operations are such close substitutes in the minds of investors that they willingly accept changes in the currency composition of their holdings without compensating changes in asset prices or exchange rates. A more recent strand of research into this topic claims that intervention operations can be effective when they signal future monetary policy operations, which are perceived to be more effective in altering asset prices, including exchange rates. The problem with this view is that it means that sterilised intervention is not an independent tool that can be used to influence exchange rates. It needs a supporting monetary policy stance to be effective.

We are left with the conclusion that sterilised intervention in the foreign exchange market by itself has only a limited impact on exchange rates. This is underscored by the reported purchase by Japanese authorities of roughly $20 billion against yen in April of last year, a purchase that barely budged the dollar/yen exchange rate. Hence, reserve assets do not expand, in a meaningful way, the set of macroeconomic policy tools that is available to policy makers in industrial countries.
In addition, there is scant evidence that the rapid development of new financial instruments and products has undermined the liquidity, efficiency, or reliability of the market for major currencies. U.S. monetary authorities have intervened only once in foreign exchange markets since August of 1995. It seems likely that industrial countries' official need for foreign exchange reserves has declined over time rather than increased.

The introduction of the Euro is clearly going to significantly alter reserve holdings. As markets for Euro-denominated assets develop, the Euro should become increasingly attractive as a world reserve currency. The bid-ask spreads on, say, the separate-currency government bonds of the Euro-11 countries before January 1 were, on average, wider than the spreads that should eventually emerge for new Euro-denominated issues. Such increased liquidity should reduce the cost of holding reserves, though conceivably the credit risk of bonds not denominated in a currency fully controlled by a domestic central bank would rise. To some extent the increased attractiveness of the Euro should reduce the demand for dollars. But history suggests that this effect is likely to be limited and evolutionary. While the stock of foreign exchange reserves held by industrial countries has increased over time, those increases have not kept pace with the dramatic increases in foreign exchange trading or gross financial flows. Thus, in a relative sense, the effective stock of foreign exchange reserves held by industrial countries has actually declined.

In recent years volatility in global capital markets has put increasing pressure on emerging market economies, and this has important implications for financial management in those economies. There have been considerable fluctuations in the willingness of global investors to hold claims on these economies over the last two years. Between 1992 and 1997, yields on a broad range of emerging market debt instruments fell relative to those on comparable debt instruments issued by industrial country governments. But this pattern reversed sharply with the onset of the Asian financial crisis in the second half of 1997, and again following the ruble's devaluation in August of 1998. These changes in foreign investors' willingness to hold claims on emerging market economies had a particularly severe impact on currencies operating under fixed or pegged exchange rate regimes. Accordingly, those countries' foreign exchange reserves, and reserve policy, played an important role in the recent financial crises. In both Thailand and Korea the monetary authorities allowed their foreign exchange reserves, net of forward contracts and other obligations, to fall almost to zero. Once this became obvious to market participants, subsequent downward pressure on the baht and the won intensified substantially. In contrast, a number of countries (Taiwan and Singapore, for example) introduced greater exchange rate flexibility without exhausting their foreign exchange reserves. These countries did not suffer the same violent downdrafts in their foreign exchange markets. In recent years Hong Kong and China have both accumulated substantial stocks of foreign exchange. While the motives for these buildups were not all economic, they may have helped these economies to weather recent financial turbulence at less cost than other emerging market economies in the region.
The Asian crisis has focused attention on the adequacy of information about official reserves. In Thailand and Korea, in particular, limited disclosure of these data by the authorities contributed to misperceptions by market participants of the resources available to the authorities to maintain the prevailing exchange rate regime. Moreover, once the crisis broke, inadequate data undermined efforts by the international financial community to resolve the situation. In response, the G-10 central banks initiated an effort to establish standards for disclosure of on- and off-balance-sheet foreign currency activities of the public sector by countries that participate, or aspire to participate, in international capital markets. The focus of this work was the authorities' foreign currency liquidity position, which consists of foreign exchange resources that can be easily mobilised, adjusted for potential drains on those resources. While greater disclosure is not a panacea for international financial crises, adherence to the standards developed in the wake of the 1997 crisis would go a long way toward preventing future stresses and facilitating responses to those that do occur. Some have argued that an equally important issue is a disclosure standard for private participants in international capital markets, especially highly leveraged entities. Such disclosure could be useful, and work on this topic is proceeding. But progress on official disclosure should not be delayed pending the outcome of these efforts.

The Asian financial crises have reinforced the basic lesson that emerging market economies should pay particular attention to how they manage their foreign exchange reserves. But managing reserves alone is not enough. In particular, reserves should be managed along with liabilities, and other assets, to minimise the vulnerability of emerging market economies to a variety of shocks. In this context some simple principles can be outlined that are likely to be useful guidelines for policymakers. It may also be useful to consider somewhat more nuanced approaches to this problem. Considerable progress has been made in recent years in developing sophisticated financial instruments. These developments create added complexity that all financial market participants, including policymakers from emerging market economies, must manage. However, they also create opportunities that emerging market economies should seek to exploit. In doing so there are lessons they can learn from advances in risk management strategies developed by major financial institutions.

In his remarks at the recent G-33 Seminar in Bonn, Pablo Guidotti, the Deputy Finance Minister of Argentina, proposed a simple guideline for policymakers in emerging market economies that a number of my colleagues at the Federal Reserve believe is worth considering. Guidotti suggested that countries should manage their external assets and liabilities in such a way that they are always able to live without new foreign borrowing for up to one year. That is, usable foreign exchange reserves should exceed scheduled amortisations of foreign-currency debts (assuming no rollovers) during the following year. This rule could be readily augmented to meet the additional test that the average maturity of a country's external liabilities should exceed a certain threshold, such as three years.
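As a minimal sketch of how the Guidotti guideline and the average-maturity test might be checked in practice (the function and all figures below are hypothetical, for illustration only):

```python
def meets_external_balance_sheet_rule(usable_reserves,
                                      amortizations_next_year,
                                      maturities_years, amounts,
                                      min_avg_maturity=3.0):
    """Illustrative check of the external balance-sheet rule above.

    usable_reserves: liquid foreign exchange reserves ($ billions)
    amortizations_next_year: scheduled foreign-currency amortisations
        falling due within one year, assuming no rollovers
    maturities_years, amounts: remaining maturity and amount outstanding
        of each external liability, for the average-maturity test
    """
    # Test 1: the country could live without new foreign borrowing
    # for up to one year.
    liquidity_ok = usable_reserves >= amortizations_next_year
    # Test 2: amount-weighted average maturity of external liabilities
    # exceeds the (assumed) three-year threshold.
    avg_maturity = (sum(m * a for m, a in zip(maturities_years, amounts))
                    / sum(amounts))
    return liquidity_ok and avg_maturity >= min_avg_maturity

# Hypothetical figures: $30 billion of reserves against $25 billion of
# amortisations, and liabilities with an average maturity of 4.1 years.
print(meets_external_balance_sheet_rule(
    usable_reserves=30.0, amortizations_next_year=25.0,
    maturities_years=[0.5, 2.0, 7.0], amounts=[25.0, 40.0, 60.0]))
```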
The constraint on the average maturity ensures a degree of private sector "burden sharing" in times of crisis, since in the event of a crisis the market value of longer maturities would doubtless fall sharply. Short-term foreign creditors, on the other hand, are able to exit without loss when their instruments mature. If the preponderance of a country's liabilities were short term, the entire burden of a crisis would fall on the emerging market economy in the form of a run on reserves. Some emerging countries may argue that they have difficulty selling long-term maturities. If that is indeed the case, their economies are being exposed to too high a risk generally. For too long, emerging market economies have managed their external liabilities so as to minimise the current borrowing cost. This shortsighted approach ignores the insurance embedded in long-term debt, insurance that is often well worth the price.

The essential function of an external balance-sheet rule should be to make sure that the actions of the government do not contribute to volatility in the foreign exchange market. Consequently it makes sense to apply the rule to all of the government's foreign assets and all sovereign liabilities denominated in, or indexed to, foreign currencies. Forward foreign exchange transactions should be recognised as liabilities, while such things as contingent credit lines, if they are truly available on demand, should be counted as foreign currency assets. In addition, key contingent liabilities should be included. This means that the foreign currency assets and liabilities of financial intermediaries that have access to the safety net (e.g., banks) probably ought to be included in the scope of the analysis.

It is important to note that adherence to such a rule is no guarantee that all financial crises can be avoided. If the confidence of domestic residents is undermined, they can generate demands for foreign exchange that would not be captured in this analysis. But controlling the structure of external assets and liabilities could make a significant contribution to stability. The adoption of any rule is not a substitute for appropriate macroeconomic, exchange rate, and financial sector policies. Indeed, the endeavour to substitute such a regime for the more difficult fundamentals of sound policy will surely fail. Countries that choose to follow this simple rule may reduce their vulnerability to financial crises. At a minimum this framework can highlight signs of vulnerability. For example, Korea's short-term debts, including those of Korean banks, were more than three times its foreign exchange reserves in December of 1996. An external balance-sheet rule could generate substantial benefits for the international community as well. If followed, it would likely limit the size of future international rescue packages, since the size of such packages is often related to the size of a country's short-term liabilities less its reserves.

In applying any simple rule, it is important to anticipate endeavours to get around it. For example, the IMF has identified more than $30 billion in outstanding emerging market debt instruments with put options. This suggests that maturity calculations ought to eschew notional maturities that would not prevail in times of crisis. In any event, it would probably be desirable to move beyond simple balance-sheet rules and to work towards a standard that is stochastic, i.e., one that takes into account the foreseeable risks that countries face.
One approach would be to calculate a country's liquidity position under a range of possible outcomes for relevant financial variables (exchange rates, commodity prices, credit spreads, etc.). It might be possible to express a standard in terms of the probabilities of different outcomes. For example, an acceptable debt structure could have an average maturity, averaged over estimated distributions for relevant financial variables, in excess of a certain limit. In addition, countries could be expected to hold sufficient liquid reserves to ensure that they could avoid new borrowing for one year with a certain ex ante probability, such as 95 percent of the time. Such a "liquidity-at-risk" standard could handle a wide range of innovative financial instruments (contingent credit lines with collateral such as the one maintained by Argentina, options on commodity prices, put options on bonds, etc.) in an appropriate manner. Such a standard would encourage countries to manage their exposure to financial risk more effectively. For example, it could force countries to think realistically about the cost of selling put options with their bonds. Of course, this approach will not work if policymakers are committed to the letter, but not the spirit, of the exercise. There is no credible way to fully preclude a counterproductive effort to gain costless benefits with new financial products that convert long-term liabilities to short.

Clearly it would not be feasible at present for most emerging market countries to implement a policy regime based on liquidity at risk. It might not even be feasible for most emerging market economies to adhere to a simpler external balance-sheet rule, since many countries will require some time to build up foreign exchange reserves, and to adjust the structure of their external liabilities. It is almost certainly desirable, however, for countries to begin to think about managing their assets and liabilities, or just monitoring their vulnerabilities, in a more sophisticated way. An external balance-sheet rule is probably a good place to start. Over the medium term, it would be desirable for emerging market economies to develop a more sophisticated approach to the problem of managing their liquidity. There is an obvious connection between the "value-at-risk" techniques used by large financial institutions to manage their exposure to risk and the liquidity-at-risk approach proposed here. It would be productive were those large financial institutions to play a role in helping countries develop their own capabilities to implement this approach, perhaps with technical assistance from G-7 supervisory authorities and international financial institutions.
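To make the liquidity-at-risk idea concrete, the sketch below simulates a country's one-year liquidity position by Monte Carlo and reports the ex ante probability of avoiding new borrowing. It is a deliberately stripped-down illustration: a single Gaussian draw stands in for the full range of variables a real exercise would model (exchange rates, commodity prices, credit spreads, puttable bonds), and every figure is invented.

```python
import random

def liquidity_at_risk_pass_rate(reserves, amortizations_next_year,
                                net_receipt_mean, net_receipt_vol,
                                n_trials=10_000, seed=1):
    """Share of simulated outcomes in which the country could avoid
    new foreign borrowing for one year (illustrative sketch only)."""
    rng = random.Random(seed)
    passes = 0
    for _ in range(n_trials):
        # Draw one year of net foreign-currency receipts from an
        # assumed normal distribution (a placeholder for a richer
        # simulation of the relevant financial variables).
        receipts = rng.gauss(net_receipt_mean, net_receipt_vol)
        # The country "passes" if reserves plus receipts cover all
        # scheduled amortisations, assuming no rollovers.
        if reserves + receipts >= amortizations_next_year:
            passes += 1
    return passes / n_trials

# Invented figures; the standard suggested above would ask for a pass
# probability of, say, 95 percent.
rate = liquidity_at_risk_pass_rate(reserves=40.0,
                                   amortizations_next_year=35.0,
                                   net_receipt_mean=5.0,
                                   net_receipt_vol=6.0)
print(f"ex ante probability of avoiding new borrowing: {rate:.1%}")
```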
Mr Greenspan discusses the US economy in a world context

Remarks by the Chairman of the US Federal Reserve System, Mr Alan Greenspan, at the 35th Annual Conference on Bank Structure and Competition of the Federal Reserve Bank of Chicago on 6 May 1999.

Any evaluation of the international financial crisis of the past two years would be incomplete without an understanding of the extraordinary strength of the U.S. economy that has acted as a buffer for much of the rest of the world. There can be little doubt that the marked widening in our trade deficit on goods and services has played an important, possibly a critical, role in supporting world stability during these trying years. Domestic demand growth in the United States has accounted for about one third of the world total since 1996. Now that there are tentative signs that we may be through the worst of the crisis abroad, an issue to which I shall return shortly, it would be useful to address the benevolent, but bedeviling, question of how the American economy, at least to date, has managed to remain an oasis of prosperity, in sharp contrast to badly sagging economies in the developing world, recession in Japan, and tepid growth in much of Europe. And, can we project how long our economy will be able to provide support to the rest of the world?

Forecasts of inflation and of growth in real activity for the United States, including those of the Federal Open Market Committee, have been generally off the mark for several years. Inflation has been chronically overpredicted and real GDP growth underpredicted. An increase in inflation doggedly forecast to follow the ever lower unemployment rate — now the lowest in three decades — has not occurred. In fact, even after accounting for the reduced bias in our price statistics resulting from methodological improvements, some measures of inflation have even continued to ease. Subdued inflation, of course, has resulted, in part, from the sharp fall in oil prices from mid-1997 through early this year. Moreover, there has been a significant weakening in non-oil import prices since 1995, owing largely to a combination of declining world commodity prices and a strengthening dollar in our foreign exchange markets through last summer. The declines in the prices of oil and non-oil imports are clearly "one-off" events. Indeed, oil prices have moved up sharply of late. The remaining easing in the inflation rate owes to disinflation in domestic value added excluding oil. The latter, in turn, is more than accounted for by a decline in the rate of increase in unit costs of output, inasmuch as profit margins have increased significantly since 1993.

A pickup in the growth of labor productivity — beyond the effects of the business cycle — appears to have been an essential factor behind the slowing in inflation. Growth in labor productivity, which had averaged less than a 1 percent annual rate in the early 1990s, rose to approximately 3 percent in the four quarters ending in the first quarter of 1999. Increased labor productivity growth has directly lowered the growth of unit labor costs. And that reduction in costs, in combination with declining import prices, has also apparently engendered lower price inflation and inflation expectations that, in turn, have curbed both nominal wage growth and unit interest costs. Consolidated unit nonlabor costs, excluding interest, have played little role in the inflation accounting exercise in recent years.
Apparently, the effects of falling equipment prices have matched the upside influence of shorter service lives on the average depreciation rate to keep unit capital consumption charges stable. I say that accelerated labor productivity is arguably at the root of declining domestic value-added price inflation because it can be misleading to identify a single variable as the exogenous force in an essentially interactive process of cause and effect. But in this case, the notion that labor productivity acceleration has been largely, though not wholly, exogenous appears persuasive. Its unexpected emergence also apparently explains much of the shortfall in forecasting of overall economic growth in recent years.

I have hypothesized before this group on several occasions that the synergies that have developed, especially among the microprocessor, the laser, fiber-optics, and satellite technologies, have dramatically raised the potential rates of return, not only on new telecommunications investments, but more broadly on many types of equipment that embody or utilize the newer technologies. The newest innovations, which we label information technologies, have begun to alter the manner in which we do business and create value, often in ways not readily foreseeable even five years ago. I do not say we are in a new era, because I have experienced too many alleged new eras in my lifetime that have come and gone. We are far more likely, instead, to be experiencing a structural shift similar to those that have visited our economy from time to time in the past. These shifts can have profound effects, often overriding conventional economic patterns for a number of years, before those patterns begin to show through again over the longer term.

There was far greater justification to view the future with the unbridled optimism of a presumed new era a century ago than today. Technological change was truly awesome around the turn of the century. In a very short number of years the world witnessed an astounding list of new creations: electric power and light, radios, phonographs, telephones, motion pictures, x-rays, and motor vehicles, just to begin the list. As this century comes to an end, the defining characteristic of the current wave of technology is the role of information. Prior to the advent of what has become a veritable avalanche of IT innovations, most of twentieth-century business decisionmaking had been hampered by limited information. Owing to the paucity of timely knowledge of customers' needs and of the location of inventories and materials flows throughout complex production systems, businesses required substantial programmed redundancies to function effectively. Doubling up on materials and people was essential as backup to the inevitable misjudgments of the real-time state of play in a company. Judgments were made from information that was hours, days, or even weeks old. Accordingly, production planning required adequate, but costly, inventory safety stocks, and backup teams of people to maintain quality control and for emergency response to the unanticipated and the misjudged. Large remnants of the information void, of course, still persist, and forecasts of future events, on which all business decisions ultimately depend, are still inevitably uncertain.
But the recent years' remarkable surge in the availability of real-time information has enabled business management to remove large swaths of inventory safety stocks and worker redundancies, and has armed workers with detailed data to fine-tune product specifications to most individual customer needs. Moreover, information access in real time, resulting from processes such as, for example, checkout counter bar code scanning and satellite location of trucks, has fostered marked reductions in delivery lead times on all sorts of goods, from books to capital equipment. This, in turn, has reduced the relative size of the overall capital structure required to turn out our goods and services, and, as a consequence, has apparently added to growth of multifactor productivity, and thus to labor productivity acceleration. Intermediate production and distribution processes, so essential when information and quality control were poor, are being bypassed and eventually eliminated. The increasing ubiquitousness of Internet web sites is promising to significantly alter the way large parts of our distribution system are managed.

The process of innovation goes beyond the factory floor or distribution channels. Design times have fallen dramatically as computer modeling has eliminated the need, for example, for the large staff of architectural specification drafters previously required for building projects. Medical diagnoses are more thorough, accurate, and far faster, with access to heretofore unavailable information. Treatment is accordingly hastened, and hours of procedures eliminated. In addition, the dramatic advances in biotechnology are significantly increasing a broad range of productivity-expanding efforts in areas from agriculture to medicine.

The evident acceleration of the process of "creative destruction", which has accompanied these expanding opportunities and which has been reflected in the shifting of capital from failing technologies into those technologies at the cutting edge, has been remarkable. Owing to advancing information capabilities and the resulting emergence of more accurate price signals and less costly price discovery, market participants have been able to detect and to respond to finely calibrated nuances in consumer demand. The process of capital reallocation has been assisted through a significant unbundling of risks made possible by the development of innovative financial products not previously available. The proliferation of products of all different designs and qualities has opened up the potential for the satisfaction of consumer needs not evident even twenty years ago. The accompanying expansion of wealth has been large, though regrettably the gains are not as widely spread across households as I would like.

Thus, in summary, the technological innovation observed in recent years does not appear to be a story of pure economic endogeneity. The aforementioned technologies appear largely sui generis to the post-World War II period, indeed to the last twenty years. They were unanticipated, and evolved, for the most part, independently of the ebbs and flows of economic activity. While the amount of capital investment that has carried these technologies into use has been, of course, affected by the costs of capital and labor, and by consumer demand, the rates of return offered by this new equipment have been largely rooted in the relative state of technology.
It is the observation that there has been a perceptible quickening in the pace at which technological innovations are applied that argues for the hypothesis that the recent acceleration in labor productivity is not just a cyclical phenomenon or a statistical aberration, but reflects — at least in part — a more deep-seated, still-developing shift in our economic landscape. Indeed it remains a hypothesis, since in economics, unlike in the physical sciences, we can never conduct fully controlled experiments for an overall economy. The evidence, nonetheless, for a technology-driven rise in the prospective rate of return on new capital, and an associated acceleration in labor productivity, is compelling, if not conclusive.

Starting in 1993, capital investment, especially in high-tech equipment, rose sharply beyond normal cyclical experience, presumably, as always, the result of expected increases in rates of return. Had the profit expectations not been realized, one would have anticipated outlays to have fallen back. Instead, their growth increased through the remainder of the decade. More direct evidence of improved underlying profitability has become increasingly apparent in company data. It seems likely that the synergies of advances in laser, fiber optic, satellite, and computer technologies with themselves and with older technologies have enlarged the pool of opportunities to achieve a rate of return above the cost of capital. The bulge in the exploitation of these "excess" potential rates of return has been stimulated by the accelerating decline in the prices of high-tech equipment starting in 1995. The amount of capacity plowed into limited areas of technology led to extra heavy price discounting. The rapid pace of innovation that has fostered shortened product life cycles also has contributed to driving down prices of high-tech equipment. Few business people report any significant perceived diminution of the backlog of these as yet unexploited profitable synergies. In recent years, businesses often have indicated a capability to dip into this backlog for capital investments that can quickly displace labor costs should they be perceived as about to rise.

This view is reinforced by securities analysts who presumably are knowledgeable about the companies they follow. This veritable army of technicians has been projecting increasingly higher five-year earnings growth, on average, since early 1995, according to I/B/E/S, a Wall Street research firm that compiles these estimates for the S&P 500. In January 1995, the analysts projected five-year earnings to rise on average by about 11 percent annually. After successive upward revisions, the March 1999 estimate was set at about 13.5 percent (profit weighted), a peak for this expansion. While there are ample data to conclude that these estimates are biased upward, there is scant evidence to suggest the bias has changed. There appears little reason to doubt that analysts' continuous upward revisions reflect what companies are reporting to them about improved cost control, which, on a consolidated basis for the economy overall, adds up to accelerating labor productivity. The alternative explanations — more rapid growth in earnings from operations abroad, a rising rate of increase in value-added prices received, or, from an economy-wide perspective, ever faster labor force growth — are not credible. Thus, companies are apparently conveying to analysts that, to date, they see no diminution in expectations of productivity acceleration.
This does not mean that the analysts are correct, or for that matter the companies, only that what companies are evidently telling the analysts about their productivity and profits is doubtless reflected in the longer-term profit projections. The macroeconomic data to date certainly suggest little evidence of a slowdown in productivity growth.

I said the evidence for technology-driven acceleration in productivity is compelling, but not conclusive. Indeed, a large number of economists doubt that the rise in productivity growth is anything more than a cyclical phenomenon or some type of statistical aberration. To be sure, they say, nonfarm productivity growth has risen in recent years, but they note that there have been other periods in our postwar record that exhibited similar patterns of acceleration only to fall back to subnormal growth. Many analysts explain the recent acceleration as a rebound from the slow pace of labor productivity growth earlier this decade; the current surge is judged a mere catch-up. The recent acceleration is admittedly not sufficient proof of an irreversible trend for a variable as statistically volatile as labor productivity. But catch-up or not, the evidence appears to be mounting that, even if productivity does not continue to accelerate, the pickup already observed does seem to explain much of the extraordinary containment of inflation despite the ever-tightening labor markets of recent years.

The newer technologies, as I indicated earlier, have facilitated a dramatic foreshortening of the lead times on the delivery of capital equipment over the past decade, presumably allowing businesses to react more expeditiously to an actual or expected rise in nominal compensation costs than, say, they could have in the 1980s. The newer technologies and foreshortened lead times apparently have made capital investment distinctly more profitable, enabling firms to substitute capital for labor and other inputs far more productively than they could have a decade or two ago. Capital, as economists like to say, has deepened significantly since 1995. The surge in investment not only has restrained costs, it has also increased industrial capacity faster than the rise in factory output. The resulting slack in product markets has put greater competitive pressure on businesses to hold down prices, despite taut labor markets. Purchasing managers have been reporting virtually no material shortages for more than three years.

Technology is also damping inflation through its effect on international trade, where technological developments and a move to a less constrained world trading order have progressively broken down barriers to cross-border trade. All else equal, the enhanced competition in tradeable goods enables excess capacity previously bottled up in one country to augment worldwide supply and exert restraint on prices in all countries’ markets. The resulting price discipline also has constrained nominal wage gains in internationally tradeable goods industries. As workers have attempted to shift to other sectors, gains in nominal wages and increases in prices in nontradeable goods industries may have been held down as well. The process of price containment has doubtless become, to some extent, self-reinforcing. Lower inflation in recent years has altered expectations. Workers no longer believe that large gains in nominal wages are needed to reap respectable increases in real wages. 
Moreover, incongruously, at the same time that labor markets have tightened, workers’ fear of job skill obsolescence has risen, apparently as a consequence of the rapid changes in technology that have induced the accelerated churning of the nation’s capital stock, with which our workforce must interact day by day. Job security has seemingly become more relevant than wage gains as a result. The return of experienced workers for further schooling is remarkable and attests to the surprising depth of worker job insecurity in the face of ever-tightening labor markets.

Because neither business firms nor their competitors can currently count any longer on a general inflationary tendency to validate decisions to raise their own prices, each company feels compelled to concentrate on efforts to hold down costs. The availability of new technology to each company and its rivals affords both the opportunity and the competitive necessity of taking steps to boost productivity. This contrasts with our experiences through the 1970s and 1980s, when firms apparently found it easier and more profitable to seek relief from rising nominal labor costs through price increases than through cost-reducing capital investments.

It is difficult to judge how much of our benign overall price behavior during the past half decade is attributable to these significant shifts in the environment in which firms function. Undoubtedly, other factors have been at work as well, including the temporary oil and import price declines that I mentioned earlier and some more lasting influences that I have not discussed, such as worldwide deregulation and privatization, and the freeing up of resources previously employed to produce military products that was brought about by the end of the cold war. There also may be other contributory forces lurking unseen in the wings that will become clear only in time.

Over the longer run, of course, the actions of the central bank determine the degree of overall liquidity and, hence, the rate of inflation. It is, thus, up to us at the Federal Reserve to secure the favorable inflation developments of recent years and remain alert to the emergence of forces that could dissipate them. For, at root, it has been the remarkably quiescent inflation that has provided the favorable financial conditions and stable economic environment in which businesses have been able to function most efficiently. Were that not the case, the innovations of the last decade could not have been implemented and the 1990s would surely have looked far less impressive.

In the event, the performance of the American economy over the past seven years has been truly phenomenal. The breadth of technological advance and its application has engendered a major upward revaluation of business assets, both real and intangible. That revaluation has induced a spectacular rise in equity prices that to many has reached well beyond the justifiable. Of most concern is how long this remarkable period of prosperity can be extended. As I have said on previous occasions, there are imbalances in our expansion that, unless redressed, will bring this long run of strong growth and low inflation to a close. Although productivity has accelerated in recent years, the impressive strength of domestic demand, in part driven by sharply rising equity prices, has meant that the substitution of capital for labor has been inadequate to prevent us from steadily depleting the pool of available workers. 
This worker depletion constitutes a critical upside risk to the inflation outlook because it presumably cannot continue without eventually putting increasing pressure on labor markets and on costs. The number of people willing to work can be usefully defined as the unemployed component of the labor force plus those not actively seeking work, and thus not counted in the labor force, but who nonetheless say they would like a job if they could get one. This pool of potential workers aged sixteen to sixty-four currently numbers about 10 million, or just 5¾ percent of the corresponding population. This is the lowest such percentage on record (the record begins in 1970) and is 2½ percentage points below its average over that period. The rapid increase in aggregate demand has generated growth of employment in excess of the growth in population, causing the number of potential workers to fall since the mid-1990s at a rate of a bit under one million annually.

We cannot judge with precision how far this level can decline without sparking upward pressures on wages and prices. Accelerating productivity may have appeared to break the link between labor market conditions and wage gains in recent years, but it cannot have changed the law of supply and demand. At some point, labor market conditions can become so tight that the rise in nominal wages will start increasingly outpacing the gains in labor productivity, and prices will then inevitably begin to accelerate. Under those circumstances, inflation premiums embodied in long-term interest rates would doubtless rise. The attendant increased risk premiums would boost real interest rates as well, as investors become less certain about future price prospects. Thus, while rates of return on new capital equipment might remain elevated, the real cost of capital could rise enough to suppress capital expenditures and, perhaps of greater relevance to the outlook, the stock market. Our negative personal saving rate indicates that the wealth effect is alive and well. That wealth effect has unquestionably been a key factor in the rise in domestic demand, which, despite productivity improvements, has exerted increasing pressure on labor markets. Thus, should equity markets retrench, consumer and business investment demands would doubtless weaken considerably.

A more distant concern, but one that cannot be readily dismissed, is the very condition that has enabled the surge in American household and business demands to help sustain global stability: our rising trade and current account deficits. There is a limit to how long and how far deficits can be sustained, since current account deficits add to net foreign claims on the United States. It is very difficult to judge at what point debt service costs become unduly burdensome and can no longer be sustained. There is no evidence at this point that markets are disinclined to readily finance our foreign net imbalance. But the arithmetic of foreign debt accumulation and compounding interest costs does indicate that, unless reversed, our growing international imbalances are apt at some point to create significant problems for our economy.

Finally, while it is reasonable to conclude that some of the gains in output per hour have been driven by fundamental forces, and are not only a cyclical phenomenon or a statistical aberration, whether they can be extended remains a wholly separate question. The rate of growth of productivity cannot increase indefinitely. 
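A back-of-the-envelope check of the worker-pool figures above (my arithmetic, not the speaker’s): a pool of about 10 million equal to 5¾ percent of the relevant population implies

$$ \frac{10\ \text{million}}{0.0575} \approx 174\ \text{million people aged sixteen to sixty-four}, $$

so a decline of a bit under one million a year shaves roughly half a percentage point per year off the ratio, in line with a measure now 2½ points below its historical average.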
While there appears to be considerable expectation in the business community, and possibly on Wall Street, that the productivity acceleration has not yet peaked, history advises caution. As I have noted previously, history is strewn with projections of technology that have fallen wide of the mark. With the innumerable potential permutations and combinations of various synergies, forecasting technology has been a daunting exercise. There is little reason to believe that we are going to be any better at this in the future than in the past. Hence, despite the remarkable progress witnessed to date, we have to be quite modest about our ability to project the future of technology and its implications for productivity growth and for the broader economy.

For, if productivity growth should level out or actually falter because additional technology synergies fail to materialize, or because output per hour has been less tied to technology in the first place, inflationary pressures could re-emerge, possibly faster than some currently perceive feasible. Obviously, if productivity growth slows, unit labor costs will rise, first pressuring profit margins and then prices. Indeed, we cannot rule out such a process even if labor productivity growth simply levels out. Our ability to forecast when a diminishing pool of potential workers will begin to raise costs, or when productivity trends will change, is limited. We do know, however, that if and when either materializes, inflation pressures are likely to surface again.

To return to my opening question: can we project how long the economy of the United States can act as a buffer to weakness elsewhere? The answer: not easily. History counsels us that sharp changes in direction are rarely, if ever, anticipated. Indeed, were these changes readily apparent, presumably businesses would adjust to that anticipation and, hence, significantly damp the cyclical tendencies in the economy.

The outlook for the American economy is particularly relevant to the realization of a full recovery in East Asia. To be sure, there are definite signs of activity bottoming out in Indonesia and Hong Kong and evidence of some gains in Thailand and Malaysia, with the most progress reported in Korea. Japan, whose economy is considerably larger than the rest of East Asia combined, excluding China, is still bedeviled by its inability to restore a vibrant banking system, though it seems to be making some progress. But the emergence of East Asia from its severe crisis, though real, remains fragile. The very improvements now under way could be threatening the resolve of a number of countries to adhere to the disciplined plans that have been instrumental in their recovery to date. Brazil has managed to stem the prospective implosion that followed in the wake of its currency crisis. But there, as in other parts of Latin America that seem to have dodged the bullet of a Brazilian-induced contagion, the potential for a letdown in the policy discipline that has served these countries well is also a concern. In general, though, discipline is likely to hold, because the lessons of 1997 and 1998 are too recent and vivid to be soon forgotten. Hence, with a little backing and filling, the emerging nation crises of the last two years are likely to gradually dissipate, and these countries should move onto a significant recovery path. The overhang of debt and difficult unresolved structural problems, however, are likely to keep a vigorous recovery at bay. 
But, in the end, their outlook will be influenced importantly by developments in Japan, Europe, and especially the United States. In Europe, gains in real GDP have remained modest, though inflation appears nonexistent. Arguably, the introduction of cutting-edge technologies has been less rapid in Europe than in the United States. Though somewhat puzzling, this lag is surely temporary, unless the thrust of innovation in the United States comes to an unexpected halt, or existing rigidities in European markets unexpectedly persist in the face of growing international competition. Europe, as a consequence, is likely to remain a positive contributor to world economic stability in the years ahead.

Let me conclude with an observation I have made before: we policymakers have been engaged in a lot of on-the-job training in recent years. The remarkable American economy, whose roots are still not conclusively known, and the Asian crises that caught us by surprise, among other humbling experiences, have made policymakers particularly sensitive to how fast the world can shift beneath our feet. We need to be alert to the dramatic changes that are continuously confronting us, but recognize that neither the fundamental laws of economics nor the human nature on which they are based has changed, or is likely to change. This will be an especially important notion to keep in mind as we continue to grapple with the rapidly changing global economic environment and its regulatory structure, which this symposium has been convened to address.
Speech by Alice M Rivlin, Vice Chair of the Board of Governors of the US Federal Reserve System, at Minneapolis, Minnesota, on 13 May 1999.
Ms Rivlin’s view on sustaining economic growth and development in the United States Speech by Alice M Rivlin, Vice Chair of the Board of Governors of the US Federal Reserve System, at Minneapolis, Minnesota, on 13 May 1999.

Everyone seems to be talking about the spectacular performance of the U.S. economy these days and wondering how long it can stay so good. Will our economy go on performing so splendidly into the next millennium or will the good times come to an end sometime soon? Indeed, the economic journalists’ most frequent question these days is some variant of: “How long will it last?” “Is the good news temporary or permanent?” I get this question all the time, as do all members of the Federal Open Market Committee. Hope springs eternal that we may be privy to some secret economic clues available only to readers of entrails inside the temple. Unfortunately, of course, we process the same economic information available to everyone else and face the same uncertainties. I don’t know the answer to how long the good times will roll, and I don’t know any responsible person who claims to know. Indeed, I think it’s a silly question. So I’d like to formulate a more important query, one that I think has more operational significance.

The operational question is this: given what we know–or think we know–about why the economy might be performing so extraordinarily effectively, what can we all do that will likely help to keep the good news flowing? Answering this question will require: (1) sorting out the list of factors that appear to be contributing to the good performance into those we may be able to control or influence and those we can’t; and (2) doing everything we can to influence the controllable factors so that we maximize the chances that the good performance continues. By “we” I don’t just mean the Federal Reserve, although the Fed has a role to play. I mean everyone in this room and people like you in all parts of the country. We all make economic policy decisions at the household, business, community or governmental level, directly and through our actions and influence as consumers, investors, producers and voters. Sustaining the high performance of the economy depends much more on what people in Minnesota do–along with those in Maine and Louisiana and Oregon–than it does on monetary policy.

What do we mean when we say the economy is performing well? The statistics usually cited are the growth rate of gross national product, the unemployment rate and the rate of inflation (for which there exist several different measures). All of these statistics have been coming up roses for some time–simultaneously. The GDP (in real terms, after inflation) has been growing continuously for eight years, and this long expansion, instead of petering out, has accelerated in the last couple of years. Real GDP grew about 4.5 percent in the first quarter of this year and has grown about 4 percent a year for the last two years, despite the negative impact of the world financial crisis that began in Asia in 1997. Millions of new jobs have been created in the last few years; and unemployment, now at 4.3 percent, has been at or below 5 percent for over two years. Not long ago, most economists thought growth this rapid and unemployment this low would inevitably produce inflation. However, inflation has remained remarkably subdued and, indeed, has continued to decline over the last several years. All of this means rising standards of living and greater economic security for most Americans. 
Moreover, the strength of the economy has been spread broadly across all regions of the country. A few areas with deep-seated economic problems, such as Northern Minnesota, still lag, and parts of agriculture are suffering severely from the downward pressure of weak world demand on commodity prices, but growth in other sectors has taken up much of the slack. Low inflation has reduced expectations of inflation, lowered long-term interest rates, encouraged investment and housing and made planning ahead easier for everyone. Moreover, in the last couple of years, one of the discouraging aspects of this long expansion finally seems to be reversing. Through 1996, the benefits of rising prosperity were flowing to those with skill and education, people already doing relatively well. Those at the low end of the skill and income ladder were falling further behind. Recently, however, very tight labor markets have meant that even unskilled and less educated workers have enjoyed higher real earnings, poverty rates have begun to decline, and even the bottom 20 percent of households have had a significant increase in their standard of living for a change.

The high performance of the economy does not just mean more material possessions for most people, although it certainly does mean that. It also means more enjoyment of and support for the creative side of life–art, music, theatre, dance–and more public resources for improving streets, parks and schools, cleaning up pollution and preserving natural beauty. The City of Chicago, for example, is putting $2 billion into renovating its crumbling schools. Perhaps most important, strong growth and tight labor markets have opened opportunities for workers at all levels–opportunities to move up, to get better jobs, to get off welfare, to go back to school with less fear of being unable to find jobs on completion.

And it isn’t just the United States that benefits from our prosperity–indeed, the vibrant American economy is contributing mightily to the revival of growth in what would otherwise be a weak global economy. Asia is only beginning the process of recovery from the deep recession that followed the financial crisis of 1997, and the hoped-for engine of Asian recovery–the huge Japanese economy–remains stalled. Much of Latin America is in recession as a result of the Brazilian financial crisis and low commodity prices; Russia is struggling and exporting its troubles to its trading partners; even Western Europe, affected by all of the above, is growing far more slowly than we are. When Americans are prosperous, we spend an increasing portion of our rising incomes on imports. Without the income generated by exporting to us, many other countries would be in much worse shape than they actually are. In the long run, the rapid accumulation of obligations to foreigners, which pays for our excess of imports over exports, cannot be sustained. In the meantime, however, we are financing some of our boom times by borrowing the savings of foreigners, and they are benefiting by exporting to us. A slowdown in the U.S. economy would reduce our trade and current account deficits, but make it harder for the rest of the world to recover.

The bottom line is, billions of people, not just us, benefit from the high performance of the U.S. economy. So what can we do to keep it functioning so well? The key conundrum is how we have managed to have such strong growth and associated tight labor markets, with subdued, even declining, inflation. 
Some of the credit clearly goes to global forces that Americans don’t control and which can’t be counted on to continue pushing in the same direction forever. The world financial crisis has weakened demand for internationally traded goods, especially commodities, including agricultural products, metals, and oil (although the oil price has come back up recently as producing countries cut supply). In the face of weak demand, world prices of just about everything tradable are low and competition is fierce. At the same time, the U.S. dollar has been strong, especially in relation to currencies of emerging market countries. In part, the strength of the dollar reflects the fact that our economy has been performing so well and has offered international investors profitable opportunities along with a safe haven from political and financial turmoil. The strong dollar has made imports cheap and has helped keep inflation low–to the benefit of U.S. consumers and the distress of some U.S. producers. We can’t count on low world commodity prices or a strong dollar continuing far into the future, and both have their downsides for us and for others.

Some features of the global economy, however, seem likely to endure. The events of the last two years come on top of a huge expansion of global trade, competition and productive capacity–all of which have given American businesses less control over prices in the face of rising costs and benefited American consumers through low prices. Moreover, although we often think of “global competition” as being associated with internationally traded goods, many of the same forces that have made the global marketplace increasingly competitive–especially faster, cheaper transportation and the revolutionary changes in communications and information management–also operate to make domestic markets broader and more competitive. As buying, selling, and comparing prices at a distance has become easier, producers of all kinds of goods and services have found themselves in a more competitive environment with less independent pricing power than they used to have. The increasing competitiveness of the U.S. economy–and that of the rest of the world–seems likely to continue to reduce inflationary tendencies in the future. Further deregulation would foster this competitiveness, as would further lowering of trade barriers. Conversely, reregulation or a relapse into protectionism would tend to negate the pro-competitive and anti-inflationary trend.

Fierce national and international competition has its cost in uncertainty, disruption of lives and settled patterns. It demands flexibility and adaptability of businesses, workers and communities. It requires a willingness of workers to learn new skills, to take chances, to move to new jobs and new places. It requires nimbleness, flexibility and risk-taking on the part of business. It requires communities to be adaptable, to make efforts (not all of which will be successful) to diversify their economic bases and to attract new jobs and residents when existing ones move on. It requires government to be imaginative and creative (words we don’t always associate with government) in designing transition assistance for workers and communities that will provide incentives to change and adapt, rather than incentives to remain stuck in the status quo. Strong growth and tight labor markets reduce the likely cost of taking chances for workers, businesses and communities. But change is hard and not everyone is willing or able to pay the price. 
One of the determinants of how long the U.S. economy can continue to perform at a high level with low inflation is how well we all learn to adapt to the changes that go with the increased competitiveness of the national and international marketplace that is helping to keep our economy growing and inflation low.

An important part of the answer to why the U.S. economy has been able to grow so strongly without inflation accelerating has been the recent resurgence in productivity growth. Productivity grew strongly from the end of World War II into the early 1970s, accounting for the rapid increase in Americans’ real incomes over that period. Then, around the time of the first oil shock in the early 1970s, productivity growth slowed drastically both here and in other industrial countries, remaining weak in the U.S. through the 1980s and first half of the 1990s. Average growth in output per hour in non-farm business was only a little over 1 percent between 1973 and 1995. In the last three years, however, productivity growth has accelerated, reaching an astonishing 4 percent growth in the last two quarters. The growth in productivity has enabled business to pay higher wages without raising prices significantly or eroding profits severely.

Is the increase in productivity a temporary cyclical response to strong growth or does it presage an upward shift in the productivity growth trend that will make it easier going forward to continue strong growth, rising real incomes and low inflation? That’s another one of those silly questions to which no one can honestly claim to know the answer. Hypotheses abound about why productivity growth might have accelerated in the mid-1990s. Technology clearly has something to do with it. Although the telecommunications and information management revolution did not start in the 1990s, it may have been just at the right point by then to offer firms that are facing strong demand and tight labor markets a way to increase their efficiency. In the past, when unemployment remained low for an extended period, economists expected productivity to suffer because firms were forced to hire less skilled workers with less experience–workers whose productivity was likely to be low. However, recent experience suggests–but certainly does not yet prove–that two factors may have combined to change the expected impact of tight labor markets on productivity. One factor is the availability of new technology, especially computers and telecommunications technology. The other factor is the revolution in management attitudes and practices that has occurred since the 1980s. A whole generation of managers has been trained to think continuously about productivity and quality management. Buzzwords like reengineering and restructuring have not only gotten into the vocabulary of managers, but they have also infiltrated their thought processes and affected their behavior. The response of many firms to shortages of skilled workers and to foreign competition has apparently been to reorganize what they were doing and how they were doing it, substituting efficient new equipment for employees, training workers to use new equipment and techniques, and outsourcing to reduce costs. All of these have combined to increase productivity.

If this hypothesis is right–or even partly right–then there are actions that many different kinds of economic actors can take to help keep a good thing going.

• Workers and potential workers can acquire more skills and more education at all levels. They can take the risk of moving to new and better jobs or starting new businesses.

• Companies can keep their focus on cutting costs and increasing productivity; they can foster research and innovation; they can offer training and employee incentives to acquire more education and skills.

• Colleges and universities, community colleges and technical institutes can offer courses at times and in places convenient for workers, for “non-traditional” students as well as traditional ones. They can put less emphasis on standard degrees and more on courses designed to help students acquire not just job skills, but the broader education that helps them to become more adaptable and to acquire the intellectual self-confidence that helps them deal with change.

• Communities, large and small, can make efforts to diversify their economic bases, upgrade and modernize their school systems, and welcome new kinds of workers and companies.

• Governments can support the basic research from which applications and innovations flow, and offer incentives to education and training and financial aid for students.

• Financial institutions can offer attractive, convenient ways for individuals to save and invest. They can take chances on new ideas, new clienteles, and innovative companies–subject to sound risk management, of course.

• All levels of government can manage their finances prudently, provide public services efficiently, build up surpluses and rainy day funds and pay down debt. This kind of behavior will result in government adding to the national saving rate and enhancing the chances of future growth, rather than subtracting from national saving by running public deficits. It is especially important to use the current and projected federal budget surplus to reduce the federal debt. The President’s budget and social security proposal would have this effect.

• Finally, the Federal Reserve can continue trying, as we have been doing, to balance the risk of allowing the economy to grow too fast for its own long-run good and begin a growth-threatening acceleration of inflation against the other risk that the economy will slow down too much in the near term and lose the benefits of healthy labor markets and sustainable growth.

In short:

• No one knows how long the good news will last.

• But there is a whole long list of actions that can be taken by workers, businesses, communities, governments, educational institutions and others that can, if taken together, increase the chances that this remarkable combination of growth and low inflation will continue for a while.

If we all do them together, we and our children will have a better chance of living in a world where change is normal but most of the changes are for the better.
Testimony by the Chairman of the Board of Governors of the US Federal Reserve System, Alan Greenspan, before the Committee on Banking and Financial Services of the US House of Representatives on 20 May 1999.
Mr Greenspan offers some suggestions to improve the international financial architecture Testimony by the Chairman of the Board of Governors of the US Federal Reserve System, Alan Greenspan, before the Committee on Banking and Financial Services of the US House of Representatives on 20 May 1999.

Mr. Chairman, Mr. LaFalce, and Members of the Committee, we at the Federal Reserve are in broad agreement with the approach outlined by Secretary Rubin, and expect to continue to work closely with the Treasury in this area. As I have indicated previously before this committee, dramatic advances in computer and telecommunications technologies in recent years have enabled a broad unbundling of risks through innovative financial engineering. The financial instruments of a bygone era, common stocks and debt obligations, have been augmented by a vast array of complex hybrid financial products, and this has led to a far more efficient financial system. These same new technologies and financial products, however, have challenged the ability of inward-looking and protectionist economies to maintain effective barriers, which, along with the superior performance of their more open trading partners, has led, over the past decade, to a major dismantling of impediments to the free flow of trade and capital. The new international financial system that has evolved as a consequence has been, despite recent setbacks, a major factor in the marked increase in living standards for those economies that have chosen to participate in it.

Notwithstanding the demonstrable advantages of the new international financial system, the Mexican financial breakdown in late 1994 and, of course, the most recent episodes in East Asia and elsewhere have raised questions about the inherent stability of this new system. These newly open markets were exposed to a huge expansion in capital inflows that their economic and financial systems were not ready to absorb. These flows, in turn, were engendered by the increasing diversification out of industrial country investment portfolios, augmented by huge capital gains through 1997. Net private capital inflows into emerging markets roughly quadrupled between 1990 and the onset of the Asian crisis. Such diversification was particularly directed at those economies in Asia that had been growing so vigorously through the 1970s, 1980s, and into the 1990s – the so-called “Asian tigers.” In the event, these economies were ill prepared to absorb such volumes of funds. There were simply not enough productive investment opportunities to yield the returns that investors in the West were seeking. It was perhaps inevitable, then, that the excess cash found its way in too many instances into ill-conceived and unwisely financed real estate ventures.

What appeared to be a successful locking of currencies onto the dollar over a period of years in East Asia led, perhaps inevitably, to large borrowings of cheaper dollars to lend, unhedged, at elevated domestic interest rates that reflected unheeded devaluation risk premiums. When the amount of such unhedged dollar borrowings finally became excessive, as was almost inevitable, the exchange rate broke. While it might seem that the consequences were easily discernible, they were not. Problems with imprudently financed real estate investments emerge with chronic frequency around the globe without triggering the size of the collapse experienced in East Asia in 1997. 
The size of the crisis became evident only when the normal buffers that any economy builds up to absorb shocks were, in the case of the East Asian economies, so readily breached under pressure. It has taken the longstanding participants in the international financial community many decades to build sophisticated financial and legal infrastructures that buffer shocks. Those infrastructures discourage speculative attacks against a well entrenched currency because financial systems are robust and are able to withstand the consequences of vigorous policy responses to such attacks. For the newer participants in global finance, their institutions, until recently, had not been tested against the rigors of major league pitching, to use a baseball analogy.

The heightened sensitivity of exchange rates of emerging economies under stress would be of less concern if banks and other financial institutions in those economies were strong and well capitalized. Developed countries’ banks are highly leveraged, but subject to sufficiently effective supervision by both counterparties and regulatory authorities that, in most countries, banking problems do not escalate into international financial crises. Most banks in emerging market economies are also highly leveraged, but their supervision often has not proved adequate to forestall failures and a general financial crisis. The failure of some banks is highly contagious to other banks and businesses that deal with them, as the Asian crisis has so effectively demonstrated. This weakness in banking supervision in emerging market economies was not a major problem for the rest of the world prior to those economies’ growing participation in the international financial system over the past decade or so. Exposure of an economy to short-term capital inflows before its financial system is sufficiently sturdy to handle a large unanticipated withdrawal is a highly risky venture.

It thus seems clear that some set of suggested standards that countries should strive to meet would help the new, highly sensitive international financial system function effectively. There are many ways to promote such standards without developing an inappropriately exclusive and restrictive club of participants. For example, in any set of standards there should surely be an enhanced level of transparency in the way domestic finance operates and is supervised. This is essential if investors are to make more knowledgeable commitments and supervisors are to judge the soundness of such commitments by their financial institutions. A better understanding of financial regimes as yet unseasoned in the vicissitudes of our international financial system also will enable counterparties to more appropriately evaluate the credit standing of institutions investing in such financial systems. There should be no mechanism, however, to insulate investors from making foolish decisions, but some of the ill-advised investing of recent years can be avoided in the future if investors, their supervisors, and counterparties are more appropriately forewarned. To be sure, counterparties often exchange otherwise confidential information as a condition of a transaction. But broader dissemination of detailed disclosures by governments, financial institutions, and firms is required if the greater risks inherent in our vastly expanded global financial structure are to be contained. 
A market system can approach an appropriate equilibrium only if the signals to which individual market participants respond are accurate and adequate to the needs of the adjustment process. Product and asset prices, interest rates, debt by maturity, and detailed accounts of central banks and private enterprises are among the signals so essential to the effective functioning of a global economy. I find it difficult to believe, for example, that the crises that arose in Thailand and Korea would have been nearly so virulent had their central banks published data prior to the crises on net reserves instead of only the not very informative gross reserve positions. Some inappropriate capital inflows would almost surely have been withheld, and policymakers would have been forced to make difficult choices more promptly if earlier evidence of difficulty had emerged. As a consequence, the G-10 central banks and the IMF initiated an effort to establish standards for disclosure of on- and off-balance-sheet foreign currency activities of the public sector by countries that participate, or aspire to participate, in international capital markets. The focus of this work was the authorities’ foreign currency liquidity position, which consists of foreign exchange resources that can be easily mobilized, adjusted for potential drains on those resources. This work was part of a larger effort to enhance disclosure of a broader set of economic and financial data under the IMF Special Data Dissemination Standard.

Such transparency suggests a second standard worth considering. Countries that lack the seasoning of a long history of dealing in international finance should manage their external assets and liabilities in such a way that they are always able to live without new foreign borrowing for up to, for example, one year. That is, usable foreign exchange reserves should exceed scheduled amortizations of foreign currency debts (assuming no rollovers) during the following year. This rule could readily be augmented to meet the additional test that the average maturity of a country’s external liabilities should exceed a certain threshold, such as three years. This could be accomplished directly, or through the myriad innovations that augment maturities through rollover options. The constraint on the average maturity ensures a degree of private sector “burden sharing” in times of crisis, since in the event of a crisis the market value of longer maturities would doubtless fall sharply. Clearly, few, if any, locked-in holders of long-term investments could escape without significant loss. Short-term foreign creditors, on the other hand, are able to exit without significant loss as their instruments mature. If the preponderance of a country’s liabilities are short term, the entire burden of a crisis would fall on the emerging market economy in the form of a run on reserves.

Some emerging market countries may argue that they have difficulty selling long-term maturities. If that is indeed the case, their economies are being exposed to too high a risk generally. For too long, too many emerging market economies have managed their external liabilities so as to minimize their current borrowing cost. This short-sighted approach ignores the insurance embedded in long-term debt, insurance that is almost always well worth the price. Adherence to such a rule is no guarantee that all financial crises can be avoided. 
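The two quantitative tests described above translate naturally into a simple screen. The following Python fragment is a minimal sketch under stated assumptions; the data structure, function name, and figures are illustrative inventions, not anything from the testimony.

```python
# Sketch of the two external-liquidity standards suggested above:
# (1) usable reserves cover all foreign-currency amortizations due within
#     one year, assuming no rollovers; and
# (2) the amount-weighted average maturity of external liabilities exceeds
#     a floor of roughly three years.
from dataclasses import dataclass

@dataclass
class Liability:
    amount: float            # face value, e.g. in USD millions (hypothetical)
    years_to_maturity: float

def passes_liquidity_tests(usable_reserves: float,
                           liabilities: list[Liability],
                           maturity_floor_years: float = 3.0) -> bool:
    """Return True only if both suggested standards are met."""
    # Test 1: reserves cover amortizations falling due within the next year.
    due_within_year = sum(l.amount for l in liabilities
                          if l.years_to_maturity <= 1.0)
    if usable_reserves < due_within_year:
        return False
    # Test 2: amount-weighted average maturity exceeds the floor.
    total = sum(l.amount for l in liabilities)
    if total == 0:
        return True
    avg_maturity = sum(l.amount * l.years_to_maturity
                       for l in liabilities) / total
    return avg_maturity >= maturity_floor_years

# Example: reserves cover the year's amortizations (test 1 passes), but the
# short-term tilt pulls average maturity to about 2.8 years (test 2 fails).
book = [Liability(25_000, 0.5), Liability(10_000, 5.0), Liability(5_000, 10.0)]
print(passes_liquidity_tests(usable_reserves=30_000, liabilities=book))  # False
```

The example illustrates why the maturity test adds bite: a country can hold reserves sufficient to cover a year of amortizations while still carrying a debt profile so tilted toward short maturities that a loss of confidence would force the entire adjustment onto reserves.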
If the confidence of domestic residents is undermined, they can generate demands for foreign exchange that would not be captured in this analysis. But controlling the structure of external assets and liabilities nonetheless could make a significant contribution to stability. Considerable progress has been made in recent years in developing sophisticated financial instruments. These developments create added complexity that all financial market participants, including policymakers from emerging market economies, must manage. However, they also create opportunities that emerging market economies should seek to exploit. In doing so, there are lessons they can learn from advances in risk management strategies developed by major financial institutions. To the extent that policymakers are unable to anticipate or evaluate the types of complex risks that the newer financial technologies are producing, the answer, as it always has been, is less leverage, i.e., less debt and more equity, and, hence, a larger buffer against adversity and contagion.

A third standard could be a legal infrastructure that enables the inevitable bankruptcies that will occur in today’s complex world to be adjudicated in a manner that minimizes the disruption and contagion that can surface if ready resolutions to default are not available. A fourth standard is the obvious necessity of sound monetary and fiscal policies, whose absence was so often the cause of earlier international financial crises. With increased emphasis on private international capital flows, especially interbank flows, private misjudgments within flawed economic structures have been the major contributors to recent problems. But inappropriate macroeconomic policies also have been a factor for some emerging market economies in the current crisis.

There are, of course, numerous other elements of sound international finance that are worthy of detailed consideration, but the aforementioned would constitute a good start. Even so, improvements in transparency, commercial and legal structures, as well as supervision cannot be implemented quickly. Such improvements and the transition to a more effective and stable international financial system will take time. The current crisis, accordingly, has had to be addressed with ad hoc remedies. It is essential, however, that those remedies not conflict with a broader vision of how our new international financial system will function as we enter the next century.
Speech by the Chairman of the Board of Governors of the US Federal Reserve System, Mr Alan Greenspan, at the Conference on International Business, organised by the Alliance for the Commonwealth in Boston, Massachusetts on 2 June 1999.
Mr Greenspan’s statement on trade, technology and economic progress Speech by the Chairman of the Board of Governors of the US Federal Reserve System, Mr Alan Greenspan, at the Conference on International Business, organised by the Alliance for the Commonwealth in Boston, Massachusetts on 2 June 1999.

I am pleased to join the Alliance for the Commonwealth today for its annual Conference on International Business. I should also like to thank my good friends John LaWare, Chairman of the Alliance, and Cathy Minehan, President of the Boston Federal Reserve Bank, who encouraged me to participate in this daylong educational symposium. There are few better examples of the benefits of trade and investment than here in Boston, where the roots of international trade go back to our nation’s origins. Massachusetts, and New England in general, are today, as in the past, actively engaged in the global trade environment and are currently home to many of the firms that are the recipients or the sources of cross-border direct investment.

The evidence is overwhelmingly persuasive that the massive increase in world competition – a consequence of broadening trade flows – has fostered markedly higher standards of living for almost all countries that have participated in cross-border trade. I include most especially the United States. And yet, as I have indicated in recent speeches and intend to outline shortly, there are reasons to be concerned that the benefits of increasingly open trade may not be allowed to be as readily forthcoming in the future as they have been in the past half century.

Although many forces have been at play, the post-World War II surge in trade has clearly owed, in large part, to significant advances in technological innovation. Since the dawn of the industrial revolution, there has been an inexorable drive to leverage physical brawn and material resources into ever greater value added or output. New insights into the laws of nature brought steam and later electric power. The development of a production quality level that facilitated interchangeable parts brought assembly line production. And the development of railroads facilitated the evolution of mass markets. Almost all of the leverage of the physical to higher value added has reflected the substitution of ideas – new insights – for material bulk and brute human effort. The resulting more effective organization of material has, of course, inevitably meant that less of it was needed per unit of output. The insights of metallurgy and architectural and engineering design, for example, enabled the construction of buildings that use far less physical material per unit of space than, say, a half century ago. The insights that led to central heating, as well as synthetic fiber, facilitated reduced clothing weight, while the development of the jet engine brought far greater annual passenger miles per unit of aircraft size. But doubtless it has been the advent in recent decades of the synergies of the microprocessor, lasers, and fiber optics that has fostered a distinct quickening in the displacement of the physical weight of output with concepts. The ability to miniaturize transistor electronic circuits has displaced huge tonnages of copper and enhanced the speed of calculation. As high tech became an increasing part of our national product, the relative physical dimensions of our value added fell dramatically. 
The per capita physical weight of our gross domestic product is evidently only scarcely higher today than it was fifty or one hundred years ago. By far the largest contributor to growth of our price-adjusted GDP, or value added, has been ideas – insights that leveraged physical reality. The consequent downsizing of output, of course, meant that products were easier, and hence less costly, to move, most especially across national borders. It is thus not surprising that the price-adjusted level of international trade, as I indicated earlier, has expanded at a far faster pace than gains in real domestic demand. Imports of goods and services as a percent of gross domestic product worldwide, on average, have risen from approximately 14 percent twenty-five years ago to 24 percent today. The growth in the physical weight of such trade, as with the national product generally, has been far less. For example, United States data on both exports and imports indicate that the price-adjusted value of our trade per pound has risen by approximately 4 percent per year on average over the past three decades.

But technology has augmented international trade for reasons beyond the downsizing of material output. New telecommunications technologies made it very difficult for the autarchic societies of the former Soviet Union to sustain their isolation in the face of the growing relative affluence of the West. News could no longer be bottled up. Even in the West, the stultification of protectionism became increasingly evident as new consumer products entered the world markets en masse. The political pressures to deregulate moribund industries and open up borders to trade soon became irresistible. The international trading system that evolved has enhanced competition and fostered the continuous scrapping of old technologies to make way for the new. Standards of living rise because the depreciation and other cash flows of industries employing older, increasingly obsolescent technologies are marshalled to finance the newly produced capital assets that almost always embody the cutting-edge technologies. This is the process by which wealth is created, incremental step by incremental step. It presupposes a continuous churning of an economy in which the new displaces the old.

But there is also little doubt that this transition to the new high-tech economy, of which rising trade is a part, is proving difficult for a large segment of our workforce that interfaces with our rapidly changing capital stock day by day. Moreover, while major advances in standards of living are evident among virtually all nations that have opened their borders to increased competition, the adjustment trauma has also distressed those who once thrived in companies that were then at the cutting edge of technology, but which have since become increasingly less competitive in both domestic and foreign markets. Economists will say that workers should move from the steel districts of western Pennsylvania to a vibrant Silicon Valley. And eventually they, or more likely their children, will. But the adjustment process is wrenching to an existing workforce made redundant largely through no fault of their own. It may be argued that all workers should have the foresight to recognize long-term job opportunity shifts and move in advance of obsolescence. This, regrettably, is a skill not in great abundance – among business managers or the economists who counsel them, as well as among workers. 
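To put the trade-per-pound figure cited above in perspective (my arithmetic, not the speaker’s), a 4 percent annual rise compounds over three decades to

$$ (1.04)^{30} \approx 3.24, $$

that is, to more than a tripling of real value added per pound of traded goods.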
Yet the protectionist propensity to thwart the competitive flow of capital from failing technologies to the more productive is unwise and surely self-defeating. History tells us that not only is it unwise to try to hold back innovations, it is also not possible over the longer run. Generation after generation has experienced episodes in which the technologically obsolescent endeavored to undermine progress, often appealing to the very real short-term costs of adjusting to a changing economic environment. From the Luddites to the Smoots and the Hawleys, competitive forces were under attack. In the end they did not prevail, and long-term advances in standards of living resumed.

Nonetheless, the campaign to expand free trade is never won. Legislation to further lower trade barriers, for example, is becoming increasingly difficult to pass in our Congress. It is a continuing battle. Further, while tariffs in industrial countries have come down sharply over the past half century, other barriers have become more prevalent. Administrative protection in the form of antidumping suits and countervailing duties is a case in point. While these forms of protection have often been imposed under the label of promoting “fair trade,” oftentimes they are just simple guises for inhibiting competition. Typically, antidumping duties are levied when foreign average prices are below the average cost of production. But that also describes a practice that often emerges as a wholly appropriate response to a softening in demand. It is the rare case that prices fall below marginal cost, which would be a more relevant standard. Antidumping initiatives should be reserved, in the view of many economists, for those cases where anticompetitive behavior is involved. Contrary to popular notions about antidumping suits, under U.S. and WTO law there is no requirement to show evidence of predatory behavior, of intention to monopolize, or of any other intentional effort to drive competitors out of business.

In the end, it is clear that all economic progress rests on competition. It would be a great tragedy were we to stop the wheels of progress because of an incapacity to assist the victims of progress. Our efforts should be directed at job skills enhancement and retraining – a process in which the private market is already engaged. Thwarting competition by placing barriers to imports will prevent the needed transitions of the productive capital stock of the United States and other nations that enable it to concentrate continuously on producing those goods and services most desired by consumers. Protectionism will also slow the inevitable transition of the workforce to more productive endeavors. To be sure, an added few years may enable some workers to reach retirement with dignity, but it will also keep frozen in place younger workers whose better job opportunities decline with time.

I regret that trade policy has been inextricably linked with job creation. We try to promote free trade on the mistaken ground that it will create jobs. The reason should be that it enhances standards of living through the effects of competition on productivity. It is difficult to find credible evidence that trade has affected the level of total employment in this country over the long run. Indeed, we are currently experiencing the widest trade deficit in history with a level of unemployment close to record lows. 
Certainly, the distribution of jobs by industry is affected by international trade, but it is also affected by domestic trade. It is the relative balance of supply and demand in a competitive market economy that determines the mix of employment. When exports fall or imports rise, domestic demand and relative prices have invariably adjusted in the long run to leave total employment relatively unaffected. As economists like to say, all imports are eventually paid for with exports.

I also regret that, despite the remarkable success over nearly half a century of GATT, the General Agreement on Tariffs and Trade, and its successor, the World Trade Organization, in reducing trade barriers, our trade laws and negotiating practices are essentially adversarial. They presume that a trade concession extracted from us by our trading partners is to their advantage at our expense, and must be countered. Few economists see the world that way. And I am rash enough to suggest that we economists are correct, at least in this regard: trade is not a zero-sum game. If trade barriers are lowered by both parties, each clearly benefits. But if one lowers barriers and the other does not, the country that lowered barriers unilaterally would still be better off having done so. Raising barriers to achieve protectionist equality with reluctant trading partners would be neither to our benefit, nor to theirs. The best of all possible worlds for competition is for both parties to lower trade barriers. The worst is for both to keep them up.

For these reasons, I am concerned about the recent evident weakening of support for free trade in this country. Should we endeavor to freeze competitive progress in place, we will almost certainly slow economic growth overall and impart substantial harm to those workers who would otherwise seek more effective longer-term job opportunities. Protecting markets from new technologies has never succeeded. Adjustments to newer technologies have been delayed, but only at significant cost. Even should our trading partners not retaliate in the face of increased American trade barriers – an unlikely event – we would do ourselves great harm by lessening the vigor of American competitiveness. The United States has been in the forefront of the postwar opening up of international markets, much to our, and the rest of the world’s, benefit. It would be a great tragedy were that process reversed. I do not believe that will be allowed to happen. There is too much at stake for us and our trading partners.
|
board of governors of the federal reserve system
| 1999 | 6 |
Remarks by Mr Laurence H Meyer, member of the Board of Governors of the US Federal Reserve, at the Conference of State Bank Supervisors, Williamsburg, Virginia on 3 June 1999.
|
Mr Meyer’s remarks at the Conference of State Bank Supervisors Remarks by Mr Laurence H Meyer, member of the Board of Governors of the US Federal Reserve, at the Conference of State Bank Supervisors, Williamsburg, Virginia on 3 June 1999. Moving Forward into the 21st Century It is a pleasure to be here, when — once again — the industry has enjoyed another year of strong performance. That is not to say that bank supervision and regulation today are without their challenges, or that there are few risks to U.S. banks. Perhaps to the contrary, as a result of market dynamics, both domestically and abroad, we can expect to see some loan losses and earnings pressures. Some weaknesses have already surfaced. For the most part, though, we continue to be spared the crisis events that can be so disruptive. This “lull” in domestic financial problems has come at an opportune time, as U.S. and world financial systems adjust to the profound changes of the last decade. The number of insured commercial banks continues to decline — down 4 percent last year and by roughly one-third since the decade began. Meanwhile, the scope and pace of financial innovation continue to expand, making many transactions increasingly difficult to manage and more opaque. Risk management and measurement techniques throughout the industry have become much more quantified with greater integration of information systems and financial theory. Large banks, in particular, have also become far more diversified, now that they can expand nationwide. These changes and the industry’s transition, in general, should be viewed as a continuous and natural response by banks to evolving market needs and new technologies. That the industry, itself, is changing is not a problem; banks must adapt to survive. The fact that the industry, rather than the legislative or regulatory process, is leading the change is also appropriate; the private sector should almost always show the way. The result, though, must be managed with care. During the 1990s, banking organizations have increased tremendously in size as a result of the consolidation process, and the complexity of many bank activities has grown as well. These developments have crucial implications for bank supervisors, including those pertaining to systemic risks. In many respects, they have also made bank supervision more difficult. We have not yet achieved “financial modernization” in terms of legislation, but we certainly have a far different banking and financial industry than existed a decade ago. Undoubtedly, more change is on the horizon, as distinctions among financial institutions continue to erode. That fact simply underscores the need for Congress to modify U.S. banking laws and permit the regulatory environment to catch up with market events. Meanwhile, bank supervisors and regulators should remain focused on their principal tasks: first, to ensure that the banking system remains sufficiently safe and sound, posing little risk to the federal safety net and remaining adequately protected against systemic risk; second, to ensure that the industry continues to provide the American public with a full range of competitively priced banking services and conforms to legislative standards of competitiveness. Perhaps more than before, achieving these goals requires us to adapt our practices to changing circumstances within the banking industry and to take full advantage of the technologies that exist. Clearly, as the industry has changed, so has bank supervision. 
As banks expanded nationwide, state and federal supervisors worked together, producing the interstate supervisory protocol that provides a more seamless oversight process for state chartered banks. As banks grew larger and more complex, we focused more on risk management practices and controls and less on a bank’s condition at a point in time. We also became more risk focused in our overall supervisory approach, emphasizing those activities that presented the greatest risks. As financial innovation and capital arbitrage took hold, we also became more aware of the need to update regulatory capital standards and to make greater use of market discipline. We are pursuing our objectives both domestically among ourselves and abroad through the Basel Committee on Banking Supervision, under the auspices of the Bank for International Settlements in Basel, Switzerland. We are designing a way forward, building upon the “three pillars” approach outlined in a consultative document released today by the Basel Committee, a subject I will return to in a moment. This approach encompasses (1) a strong, risk-sensitive regulatory capital standard; (2) an active supervisory program; and (3) improved bank disclosures that allow the marketplace to evaluate an institution’s risk posture and to reward or discipline it appropriately. In my remarks today, I would like to address many of these and other points, with particular emphasis on the supervisory process and how we at the Federal Reserve are adapting to change. At the outset, I would emphasize that bank supervision is, by its nature, a dynamic process. Our practices must constantly improve or they will quickly become outdated. Supervisors must also be flexible, both in their application of supervisory techniques to banks and in their expectations regarding what practices individual banks should follow. Perhaps more so than any other, the U.S. banking system is highly diverse, with its thousands of small community banks and a small number of increasingly large, highly complex, internationally active institutions accounting for a growing share of total bank assets. Neither a single supervisory approach nor a single risk management technique will work for all. That need for flexibility and adaptation has been well served by our dual banking system and by the ability of individual states and state chartered banks to innovate. Large and Complex Banking Organizations One aspect of supervision that has become more crucial to our oversight process relates to systemic risk and to the activities of our largest banking organizations. A decade ago, for example, the 20 largest U.S. banking organizations held 68 percent of the assets of the 50 largest bank holding companies; now it is 82 percent. Then, the 20 largest holding companies held 37 percent of all U.S. commercial bank assets; now that figure has risen to 64 percent. Those figures conceal, of course, the dramatic increase in the complexity of their activities represented by securitizations and derivative products. The notional value of derivative and futures contracts held by U.S. banks now exceeds $33 trillion, nearly five times the level at the end of 1990. Securitizations by U.S. banks, at $270 billion, have grown as fast and are expanding beyond consumer-based loans, such as credit card and auto loans, to commercial credits. Virtually all of these securitization and derivative activities are concentrated among the largest banks. 
While notional values and amounts securitized say almost nothing about the level of underlying risk to individual banks, they speak strongly to the increased volume and complexity of large bank activities and to the somewhat hidden risks they face. For these organizations, balance sheets and traditional lending have much different meanings from a decade ago. Last year, the Federal Reserve responded to this trend by sharpening its supervisory focus on a smaller number of large complex banking organizations, both domestic and foreign. We now give increased attention to roughly twenty U.S.-owned and another ten foreign-owned banks. Although they are generally the largest institutions we supervise, they warrant the greater attention not only because of their size, but also because of their on- and off-balance-sheet activities, their broad range of products and services, their more complex domestic and international oversight structure, and their role in payment and settlement systems. We refer to them as LCBOs, for “large, complex banking organizations.” In supervising these institutions, we recognize that each is unique and complex and that it is particularly necessary for our analysts, examiners, and supervisors to understand sound practices within the industry and to compare activities and risk management techniques among institutions. Accordingly, we are taking a “portfolio” approach, whereby we evaluate practices across institutions where we find similar business lines, characteristics, and risk profiles. This approach fosters more informed and consistent supervision among institutions and provides supervisory staff with greater opportunities to identify and promote sound practices. It also accommodates more readily the development and coordination of staff expertise throughout the Federal Reserve System. The Federal Reserve’s supervisory approach toward LCBOs requires ongoing monitoring, including a formal re-evaluation of an institution’s risk profile and a quarterly update of our supervisory plan. This periodic assessment is based, in part, on internal management reports, internal and external audit reports, and publicly available information. Since these organizations typically conduct a broad range of regulated activities, supervisory staff must also frequently communicate and coordinate their own activities with those of other bank and nonbank regulators. Management of this oversight process rests with a senior staff member designated as CPC, or “central point of contact.” That individual, in turn, coordinates virtually all interaction between the Federal Reserve and the institution, and directs an identified team of examination and supervisory staff having specialized skills tailored to the unique profile of the institution. This structure, combined with the ability of the CPC to draw upon additional staff throughout the Federal Reserve System, should promote greater understanding of an institution’s business and risk management process, while reducing our level of intrusion. Indeed, a necessary aspect of our supervisory review is maintaining a steady flow of relevant information about an institution’s exposures and risk management system in order to reduce the time-consuming and burdensome discovery process often associated with traditional examination and oversight techniques. 
Periodic review of management reports should not only enhance our knowledge of specific exposures and events, but also provide insights into a bank’s control process and into what information management deems important. In some cases, it may be most convenient to us and to the bank if we have direct access, on-line, to management information. Indeed, that is the case now for a couple of our largest institutions, particularly with respect to the internal audit process. Effective supervision of an LCBO requires a supervisory plan that is tailored to the institution’s current risk profile and organizational and operational structure and that considers the activities of other supervisors — highlighting, once again, the need for communication and coordination. The plan should address the major risks (e.g., credit risk, market risk, and so forth) and should employ follow-up actions ranging from off-site analysis and meetings with management, to targeted or full-scope examinations. CPCs should also structure the plan to achieve the proper balance of review of risk management practices and transaction testing, the latter relying typically on statistically sound sampling techniques. Information sharing and coordination with other supervisors are key elements of the program and are essential to successful supervision of these large institutions. For this purpose, the Federal Reserve will continue to enhance its base of information technology and extend its resources to other supervisors. Many of you are already aware of an information system we are developing called the Banking Organization National Desktop, or “BOND.” When introduced next year, that system should provide supervisors with both public and confidential information about an institution in a highly user-friendly way. The system should prove particularly helpful in monitoring and evaluating conditions at the largest institutions. For the system to be useful, though, it needs to be used — and to be fed the information people want. These requirements, in turn, demand a high degree of security, so that individuals can take comfort that information they put into the system is not misused or misdirected. This aspect of the system has been given great importance and should actually strengthen the level of security surrounding confidential information, while also disseminating necessary information. In supervising LCBOs, we not only expect more of ourselves, we also have higher standards for the institutions. A fundamental tenet of supervision is that the nature of a bank’s risk management process must be consistent with the level of underlying risk. More sophistication is necessary as transaction volume and complexity rise. Credit Risk and Capital Our higher expectations for the level of management skills and sophistication at larger banks will also become more apparent in the years ahead in terms of capital standards. Much has been said recently in supervisory statements and industry publications about the need to revise the 1988 Basel Capital Accord and to improve, more generally, the credit risk management of banks. Credit risk has always been the dominant risk in banking, yet it remains crudely measured. This lack of quantitatively rigorous risk measurement within the industry explains why we developed the current Accord as we did. The trouble is that measuring credit risk is hard. 
Experienced bankers and examiners can usually distinguish a good loan from a bad one, but quantifying the level of risk on a portfolio basis and bankwide is quite a different matter. Much attention has been devoted to the exercise within the industry and among bank supervisors, but no solution is at hand. Best practice banks and early research at the Federal Reserve suggest that significant strides are being made in credit risk management, but the industry and regulators still have a long way to go. It is — and should be — the highest priority for the industry and the supervisors. Last year, as you may recall, the United States and the other countries represented on the Basel Supervisors Committee adopted new capital requirements for trading activities that are based on a bank’s internal measure of “value at risk.” That regulatory amendment represented an important shift in regulatory thinking and a greater willingness by the regulatory community to build on risk management practices of banks. With market risk, though, the basic elements of the “value-at-risk” measure were relatively well established, although most institutions still needed to strengthen certain aspects of their calculations and management processes. In that exercise, the necessary data for identifying current trading positions and measuring the historical volatility of their market values were also generally available. The mark-to-market process and short horizon of daily trading also helped greatly in evaluating the effectiveness and overall “accuracy” of the market risk models. None of these crucial elements exists today for measuring credit risk, and industry practice has not yet converged around a particular measure of credit risk, or even a conceptual definition of credit loss. Some models, for example, identify a loss only when a borrower defaults, largely reflecting the view that the bank will hold the asset until it matures. If the model forecasts a default during the relevant time horizon, it then calculates an expected loss, or “loss rate, given default.” Other models take more of a mark-to-market approach, recognizing the gains or losses in the economic value of a loan portfolio resulting not only from defaults or expected defaults, but also from changes in the credit quality of a borrower or from different market and economic conditions. As you can sense, model structures and assumptions become crucial. Moreover, the fundamental input — a borrower’s credit risk rating — can be highly subjective and is largely determined internally within the bank. Some borrowers have public debt ratings from recognized rating agencies, but most do not. Even a public rating needs to be translated into the rating schedule of each bank. This lack of credit risk data is a serious weakness, with even large banks lacking enough historical default experience for a given borrower type to determine appropriate capital charges without substantial judgmental input. The subjective and variable quality of risk ratings, lack of historical data, and the long time horizon before answers are known about a portfolio’s underlying strength make validating credit risk models a difficult task. If more risk-sensitive models are to be used for regulatory capital standards, these differences become more important because they can have material effects on competition and on the safety and soundness of banks, both domestically and abroad. 
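To make the distinction concrete, a minimal numerical sketch in Python of the default-mode calculation just described might look as follows; every exposure, default probability, and loss rate here is invented purely for illustration.

# Minimal default-mode expected-loss sketch: EL = EAD x PD x LGD for each
# exposure, summed over the portfolio. All parameters are hypothetical; a
# mark-to-market model would in addition revalue loans for rating
# migrations, not just for defaults.
portfolio = [
    # (exposure at default, probability of default over the horizon, loss given default)
    (10_000_000, 0.002, 0.45),  # high-grade borrower
    (5_000_000, 0.015, 0.50),   # middle-market borrower
    (2_000_000, 0.060, 0.60),   # low-rated borrower
]
expected_loss = sum(ead * pd * lgd for ead, pd, lgd in portfolio)
print(f"Portfolio expected loss: ${expected_loss:,.0f}")  # $118,500

Even this toy calculation makes plain how heavily the answer leans on the assigned default probabilities and loss rates, which is exactly where the subjectivity of internal risk ratings enters.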
Moreover, unlike trading activities, where the related capital requirements represent a small part of the total, credit risk counts. We need to get this measure right for obvious reasons. “Getting it right” also means providing the proper risk management incentives to banks. Far more needs to be done in measuring and managing credit risk than has been done so far by U.S. and foreign banks. Make no mistake: as I noted, much progress has been made in recent years. But much more progress is necessary before most large banks, themselves, can gain a solid grasp on their risk exposures for risk management purposes, let alone before supervisors will be able to substantially revise the Capital Accord. In recent months, Federal Reserve staff visited a number of large money center banks to understand better what role credit risk models now play in senior management’s internal assessments of the institution’s capital adequacy. While, again, progress is being made, the results were somewhat disappointing. Nearly all institutions indicated that in their own internal reviews, they focused largely on factors such as their targeted external credit rating and their regulatory risk-based capital ratios relative to those of their primary competitors. If these figures were in line, they generally viewed their capital as adequate. Although the targeted and actual ratios were significantly above regulatory minimums, these responses were disappointing, indeed. It should be noted that a key ingredient in rating agency evaluations of bank capital is the risk-based capital ratio. While we regulators are flattered by the use of our capital standard for internal and marketplace analysis, we must emphasize that the well-known shortcomings of the standard make it an inappropriate tool for many internal and market purposes. We expect institutions to be ahead of regulators in this analysis, not the other way around. Where models are available, generally pertaining to commercial credits, they are used principally in setting concentration and exposure limits, pricing, and evaluating performance on a risk/return basis. Important uses, for sure, but in no case did management indicate that their risk measures offered a significant input to evaluating the institution’s overall capital adequacy. This assessment is not intended to be pessimistic. I believe significant progress can be made if sufficient attention and resources are devoted to the effort, and if the industry is given the right incentives to make it work. Supervisors can provide some of the incentives — both the carrot of an improved capital standard and a better risk management process, and the stick that management will be judged, in part, by its ability to quantify risk. In large part, virtue can also be its own reward. Banks that effectively measure and manage risk will make and price credit better. New Capital Proposal As I mentioned earlier, the Basel Committee on Banking Supervision has now released its long-awaited consultative report on revisions to the 1988 Basel Capital Accord. For the largest institutions, the Accord has increasingly been weakened by the changes that have occurred in financial markets. Most importantly, banks here and abroad have been engaging in capital arbitrage techniques designed to move their higher quality, lower-risk assets to securities markets, sometimes reducing their capital charges on these assets by more than is proportional to the risk positions they retain. 
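A stylized example shows the mechanics. The 0, 20, 50, and 100 percent buckets and the 8 percent minimum below are the actual parameters of the 1988 Accord; the balance sheet itself is invented. Because every corporate credit carries the same 100 percent weight, securitizing the best corporate assets lifts the measured ratio even where much of the economic risk stays with the bank.

# Four-bucket risk weighting under the 1988 Accord. The balance sheet
# (in billions) is hypothetical.
RISK_WEIGHTS = {"sovereign": 0.00, "interbank": 0.20, "mortgage": 0.50, "corporate": 1.00}

def risk_weighted_assets(book):
    return sum(amount * RISK_WEIGHTS[bucket] for bucket, amount in book.items())

capital = 8.0
book = {"sovereign": 10.0, "interbank": 10.0, "mortgage": 30.0, "corporate": 60.0}
print(f"Capital ratio before securitization: {capital / risk_weighted_assets(book):.1%}")

# Securitize the best half of the corporate book; a retained first-loss
# position can keep much of the economic risk at the bank.
book["corporate"] -= 30.0
print(f"Capital ratio after securitization:  {capital / risk_weighted_assets(book):.1%}")

In this invented example the measured ratio rises from roughly 10 percent to 17 percent of risk-weighted assets, even though the risk actually shed may be far smaller.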
In addition, the remaining higher-credit-risk assets have the same regulatory capital charges as the lower-risk assets that have been securitized, changing the meaning of the resultant capital ratio. For these and other reasons, the 1988 Accord has become increasingly undermined and the risk-weighted capital ratios have become more difficult to interpret. Modifying the Accord is an incredibly complex procedure, not only because it must be negotiated among 12 nations and will affect the policies of many more, but also because the issues are so difficult. As I noted, the underlying approach has three equally important legs, all of which reflect efforts to respond to the evolving changes in financial markets. The first leg is modification of the Capital Accord per se, especially for the large complex banking organizations; the most important change will be to the risk-weighting scheme on portfolio assets. The framework calls for moving from a four-weight scheme — with most of the assets at one weight regardless of risk — to multiple and more sensitive risk weights and also includes steps to curtail loopholes dealing with securitization transactions. The risk weights, in turn, would be based perhaps on one or a combination of techniques: external ratings, internal management risk ratings, and/or bank-specific formal risk models. Please note that in each of these, the process is leveraging off the market’s risk evaluation, including what the bank management applies for its own purposes. Consistency and improvement in bank risk management are thus a prerequisite to improved, and more rational, capital regulation. The second leg is increased market discipline. Market discipline, of course, can occur only to the extent that the banks make information available to creditors and counterparties that have the ability to respond to that information. Thus, the consultative document contemplates more transparency about bank risk-taking and controls so that creditors and counterparties can decide more rationally about their required compensation for the risk of dealing with that bank. Of course, the objective is to create the incentives for more rational and efficient risk-taking by the bank. The final leg is supervisory review of the capital adequacy of the banking organization. The purpose is twofold: first, to ensure that a bank’s capital position is consistent with its overall risk profile and strategy and, second, to encourage early supervisory intervention. The purpose of this review is to provide supervisory comfort that each bank’s internal process for assessing its capital adequacy, and each bank’s actual capital levels, are consistent with the scale and complexity of its risk-taking activities. In some cases, these reviews may well result in requiring individual banks to hold more capital than the minimum regulatory standard. As we think about capital standards for the years ahead, it seems appropriate to consider a more bifurcated approach: one standard for large, complex institutions; another for most other banks. That direction seems especially necessary if we do pursue a more sophisticated, risk-sensitive measure of credit risk to capture developments and techniques at the larger and more complex banking organizations. My sense is that greater complexity would be unnecessary for community banks, where a simpler, less burdensome approach may be quite satisfactory for supervisory purposes. 
Nevertheless, all banks should take to heart the message the Federal Reserve and other supervisors are sending about the need for stronger practices for evaluating credit risk. Loan Loss Reserves On the topic of capital, I would like to say a few words about loan loss reserves and the interaction of the Federal Reserve and other federal banking agencies with the Securities and Exchange Commission. In recent months, as you know, the Commission has devoted increased attention to practices of large U.S. banks in setting their level of loan loss reserves. The issue surfaced last fall, when SunTrust was required to reduce its reserves and revise previous financial statements. Since then, several other institutions have been asked to explain their reserve practices to staff of the SEC. As supervisor of these holding companies, the Federal Reserve has been actively involved in this matter from the outset and has urged the Commission to work with us, with the institutions, and with the other banking agencies toward a satisfactory resolution. Obviously, this means reconciling different perspectives on this issue. For example, in light of increased volatility and banking risks in recent years, the banking industry has appropriately maintained robust reserving practices and levels. From a safety and soundness perspective, the Federal Reserve and other banking regulators have expected institutions to maintain strong loan loss reserves that are conservatively measured. In carrying out its responsibilities, the SEC has emphasized the need for financial statements and reported earnings to be transparent and, therefore, for allowances to be adequate but not excessive. Enhanced transparency has also been a critical objective of bank regulators, both domestically and internationally. Last week, press reports characterized the Fed’s position as being different from that of other federal banking agencies. That is not true. The main point of contention between the banking agencies and the SEC appears to be whether the recent guidance issued by the Financial Accounting Standards Board (FASB) represents a mandate to reduce reserves. The Federal Reserve has worked with the SEC to issue guidance emphasizing that the FASB guidance does not mandate any material change and that bank management should feel free to maintain reserves at the high end of a reasonable range. The Federal Reserve’s policy guidance provides background information that is intended to assist institutions and their auditors in understanding the SEC announcement and the FASB article in the broader context of other accounting initiatives and discussions between the SEC and the Federal Reserve on allowance accounting matters. Moreover, our policy letter sends a clear message that the Federal Reserve wants banks to maintain prudent reserving practices and not to over-react as a result of a narrow interpretation of the FASB guidance. The other banking agencies appear less sanguine about the intent of the FASB guidance and have registered protests on Capitol Hill. The Federal Reserve will continue to work with the SEC and the accounting profession in the months to come in providing further information regarding appropriate documentation and other matters. I understand Richard Spillenkothen, the Federal Reserve’s Director of Banking Supervision and Regulation, will also speak to this topic in his comments at lunch. 
Disclosure and Market Discipline Although we disagree with the need for banking organizations to revise previous financial statements, battling the SEC on many of these issues does not seem the proper course. They have an obligation to enforce sound reporting and disclosure practices as best they can, and our financial markets have been well served in the process. The U.S. banking industry has its obligations, too, to manage its risks and to tell its story to bank supervisors, the SEC, and the general public. If for no other reason than the fact that banks today are so large and complex and have the potential to present such widespread risk, these largest institutions, in particular, should be held to high performance and compliance standards. As bank supervisors, we should welcome the market’s help to identify and assess banking risks and to minimize the risk of moral hazard. One approach the Federal Reserve is exploring would enhance the role of investors in bank or bank holding company subordinated debt. Unlike shareholders, who benefit from any gains from excessive risk, subordinated debt holders have only downside risk. As a result, their incentives are similar to those of supervisors and the bank insurance fund: they lose if the bank defaults but they don’t participate in outsized gains. A difficulty, however, in creating a greater role for subordinated debt is determining how to provide investors with adequate and timely information about a bank’s risks and with sufficient leverage to affect management decisions. From the supervisor’s perspective, another difficulty is separating market “noise” in changing yield spreads from meaningful signals they may provide. We will be collecting and analyzing these data in the months ahead and will be evaluating their potential usefulness, both as a tool for supervision and as a market mechanism for providing feedback to banking organizations. Whether or not the exercise proves fruitful, it points in the right direction — providing incentives for greater market discipline and for sound management of banking risks. Conclusion In closing, I would remind you that we are beginning to see slippage in important indicators of industry strength. Though still low by historical standards, the volume of nonperforming assets increased last year for the first time since 1991, with the deterioration concentrated within commercial and industrial loans. Delinquencies in agricultural loans have also risen, as a result of extremely weak markets for many farm products. Continued weakness in much of this sector could begin to weigh on some community banks. The next stress-point for any particular bank may come from poor credit quality, from structural and competitive pressures within the industry, or from many other sources. Fortunately, the U.S. commercial banking system has demonstrated a great deal of strength and resiliency in dealing with challenges of the past, and it seems as strong and as well positioned to handle stress now as it has been in many years. I have no doubt that the U.S. banking system will continue to grow and that it will remain central to the nation’s financial system. To do that, though, requires that we all adapt to changing times and that banks manage risk carefully in both good times and bad.
|
board of governors of the federal reserve system
| 1999 | 6 |
Speech by Mr Laurence H Meyer, a member of the Board of Governors of the US Federal Reserve System, at the conference on Reforming Bank Capital Standards in New York on 14 June 1999.
|
Mr Meyer’s remarks on market discipline as a component of banking supervision and regulation Speech by Mr Laurence H Meyer, a member of the Board of Governors of the US Federal Reserve System, at the conference on Reforming Bank Capital Standards in New York on 14 June 1999. Good afternoon. The topic of this conference – reforming bank capital standards – could not be more timely. Reform is very much an issue on the minds of all supervisors and market participants. But regulatory capital standards are only one component of the overall framework for maintaining bank safety and soundness. This overall framework can be described, as in the consultative paper recently released by the Basle Supervisors Committee, in terms of “three pillars” – bank supervision, market discipline, and regulatory capital standards. There are two approaches to assessing the adequacy of the overall framework. First, we could consider merely rebalancing the existing components, in search of the most efficient combination. Some have argued, for example, that enhancing market discipline could permit reduced reliance on the more intrusive and burdensome regulatory and supervisory components. Second, to the extent that recent changes in banking and financial markets have made bank regulation and supervision more difficult, we may also need to incrementally improve capital standards and supervisory practices as well as enhance market discipline. My remarks today will focus on the market-discipline component of the three-pillars framework, specifically on how we might enhance market discipline in banking as we adapt to changes in banking and financial markets that have made bank supervision and regulation more difficult. There is an irony here in that it might take additional regulation – for example, increased disclosures and/or a mandatory subordinated debt requirement – to enhance market discipline. I will also discuss practical issues that must be considered and questions that must be answered if we are to move in this direction. Adapting to change As we all know, financial markets and institutions are evolving at a rapid and unprecedented pace. This evolution has been driven in part by statutory reforms and dramatic regulatory changes. The abolition of interstate banking constraints has allowed for the creation of a growing number of very large banking organizations. The erosion of legal and regulatory barriers has permitted banking organizations to expand their scope of activities. And both the relaxation of trade barriers and the freer flow of capital have facilitated the operation of banks across national boundaries. Financial and technological innovations have had an equally dramatic effect on financial markets and institutions. As a result of technological innovations, the increased speed and reduced cost of transacting have improved the depth and liquidity of financial markets. These improvements, together with advances in financial theory, have led to the adoption of new and arguably more complex tools for measuring, taking, and controlling risks. The growing size and complexity of banking organizations make the supervisor’s job of protecting bank safety and soundness increasingly difficult. Size, scope, and complexity simply make it more difficult for supervisors to understand and evaluate bank positions and operations. In response, heightened supervisory focus on risk-management procedures and policies has been under way for some time. 
This focus recognizes that a bank’s own risk-management process is the linchpin for controlling risks. However, while new procedures, policies, and tools for risk management may ultimately buttress supervision and regulation, these tools are based on relatively recent financial theories that have yet to be tested under the full range of market conditions. Moreover, the sophistication and complexity of these new tools often make it more difficult, not less, for supervisors to assess the true risk of a banking organization and to assign appropriate capital requirements. Adding to these difficulties, supervisors must account for risk exposures that are altered at an ever-faster pace. We have often said that, in this environment, we want supervision and regulation to simulate or mimic market discipline in the sense of creating the proper incentives, costs, and rewards. I also believe that we ought – where we can – to skip the middlemen and go right to our first line of defense: market discipline. By aligning market incentives with regulatory incentives, policies designed to harness market forces could complement bank supervision by encouraging banks to refrain from excessive risk-taking. Indeed, I believe that market discipline is a particularly attractive tool for encouraging safety and soundness in a rapidly evolving environment. Market discipline is inherently flexible and adaptive with respect to innovations, since market participants have incentives to change the ways that they evaluate risks as innovations are adopted. Market discipline as a complement to supervision and regulation Before discussing how market discipline might complement bank supervision and regulation, it is useful to discuss how market discipline works. It seems to operate through two channels. “Direct” market discipline is exerted through risk-sensitive debt instruments when a banking organization’s expected cost of issuing those instruments increases substantially with an increase in its risk profile. For this to occur, investors must gather and assess information about the banking organization’s risks and prospects, and then incorporate that information into their decisions to buy the organization’s debt. The anticipation of higher funding costs provides an incentive for the banking organization to refrain from excessive risk-taking. “Indirect” market discipline is exerted through risk-sensitive debt and equity instruments when private parties, and possibly government supervisors, monitor secondary market prices of those instruments in order to help determine the risk exposure (or default probability) of a banking organization. In response to perceived increases in bank risk, such parties could then take a variety of actions that increase bank operating costs. For example, purchasers of bank claims could increase the bank’s cost of funds and limit its supply of credit, and both private counterparties and supervisors could reduce the bank’s ability to engage in certain types of contracts. The anticipation of these actions, which are essentially various types of penalties, provides banking organizations with incentives to refrain from excessive risk-taking. Market discipline does not come naturally to banking. The federal safety net limits direct market discipline because it reduces the demand for disclosure and the risk-sensitivity of debt holders. Clearly, insured depositors have almost no incentive to penalize banks for excessive risk-taking. 
And, uninsured depositors, because of depositor preference laws, may also perceive relatively little need to impose higher costs on banks for excessive risk-taking. Given these incentives, secondary market rates and spreads on these debt instruments would be inadequate – if not irrelevant – barometers of a bank’s risks and would therefore generate little indirect market discipline. Further, the real and perceived certification of soundness provided by supervisory authorities may also reduce the demand for disclosures and the risk-sensitivity of debt holders. Compounding these disincentives for investors to evaluate bank risks, the raison d’être of banks is that these institutions provide credit in environments characterized by asymmetric information. Therefore, banks are inherently opaque and difficult to assess. Nevertheless, there seems to be fairly strong statistical and anecdotal evidence supporting the view that both direct and indirect market discipline currently are exerted on large banking organizations. With respect to direct market discipline, econometric studies of the relationship between deposit growth and portfolio risk have generally found that uninsured depositor holdings decline with increases in the depository institution’s risk. And, other econometric studies have found that rates on uninsured certificates of deposit are sensitive to measures of risk. Supervisory experience is consistent with both of these observations. Other types of bank liabilities also appear to be sensitive to risk. For example, during periods of financial stress, riskier banking organizations tend voluntarily not to issue subordinated debt. This is precisely what would be expected if the subordinated debt market imposed risk premiums on banking organizations. Above and beyond this implication that issuance costs are risk sensitive, this empirical evidence suggests that direct market discipline is substantial enough in the subordinated debt market to affect actual decisions made by banking organizations. The evidence with respect to indirect market discipline is also encouraging. Studies that have considered recent secondary market spreads on subordinated debt have found them to be statistically sensitive to various measures of risk. Importantly, while risk-sensitivity of subordinated debt spreads is necessary for market participants to exert indirect market discipline on banking organizations, it is not sufficient. Market participants outside of the subordinated debt market also must monitor these spreads to assess the condition of the banking organization. Indeed, market participants confirm that the “Street,” not just the bond market, appears to pay considerable attention to such spreads. On balance, the empirical evidence together with anecdotal evidence from the market indicates that secondary market subordinated debt spreads are generating indirect market discipline on banking organizations. While market discipline is currently exerted directly and indirectly on large U.S. banking organizations, the strength of this discipline could be enhanced by policymakers in a number of promising ways. For example, a policy improving disclosures of bank risk exposures and internal capital assessments could potentially improve the market’s ability to assess risks. Another option is for supervisory policy to enhance indirect market discipline by linking supervisory actions to secondary market information. 
For example, secondary market information could be used to help time bank examinations, to possibly limit bank activities, or to potentially raise bank capital requirements. In this way, market discipline might strengthen bank supervision. While it is not yet clear whether secondary market indicators provide information that the supervisor does not yet have, at worst such indicators could confirm supervisory views or could prompt supervisors to reassess their appraisals of banks. Using subordinated debt to enhance market discipline A promising approach to enhance market discipline, which has received considerable renewed attention of late, is to adopt a subordinated debt policy. There are a number of features of subordinated debt that make it particularly attractive for providing increased market discipline. First of all, subordinated debt is the most junior of all bank liabilities. Therefore, these bondholders are the least likely to be bailed out in the event of bank failure, and the most likely to demand disclosures of a bank’s condition. Second, subordinated debt holders do not partake in the upside gains associated with risk-taking. Hence, at least in principle, the issuance and secondary market spreads on subordinated debt should be particularly sensitive to banking organization risk. In contrast, since equity holders may also benefit from the upside gains associated with risk-taking, equity issuance may provide inadequate direct market discipline, and the signals of bank risk derived from secondary market prices may be blurred and difficult to interpret. In addition, subordinated debt has a relatively long maturity. This feature magnifies the risk-sensitivity of the debt and reduces the probability of a “silent run” on the bank occurring when the debt becomes due. Subordinated debt issued in place of insured deposits also provides an extra “cushion” for the deposit insurance fund in the event of bank failure. Subordinated debt is also attractive from a market discipline perspective because there exists a well-established, deep, and fairly liquid market for such instruments. Market participants claim that bond issues of $150 million or more are traded in liquid markets – a requirement satisfied by very large bank holding companies and a much smaller number of very large banks. The standardization of publicly traded subordinated debt of banking organizations is also striking and desirable from a market discipline perspective. The majority of U.S. bank or (more commonly) holding company subordinated debt instruments being issued today are fixed-rate, noncallable, 10-year maturity bonds with few bells and whistles. These two features of the market, liquidity and standardization, facilitate the comparison by market participants of secondary market subordinated debt spreads. The finding in recent empirical research that spreads are sensitive to banking organization risks further supports the depth of the secondary market. Not surprisingly, market participants claim routinely to monitor such spreads for various peer groups, which is consistent with the imposition of indirect market discipline on these banking organizations, and some of this discipline is no doubt passed through to banks, particularly if the bank makes up a sizable fraction of the bank holding company. 
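A toy version of the econometric exercise these studies perform might look like the following; the six observations are fabricated for illustration, and the actual studies control for maturity, liquidity, and issuer effects.

# Regress secondary-market subordinated debt spreads on a risk proxy
# (here, a nonperforming-asset ratio). All data points are fabricated.
import numpy as np

npa_ratio = np.array([0.5, 0.8, 1.1, 1.6, 2.2, 3.0])          # percent of assets
spread_bp = np.array([65.0, 72.0, 90.0, 110.0, 150.0, 195.0])  # basis points

slope, intercept = np.polyfit(npa_ratio, spread_bp, 1)
print(f"Each 1 percentage-point rise in NPAs adds roughly {slope:.0f} bp of spread")

A positive and statistically significant slope of this kind is the risk sensitivity that the empirical literature reports.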
Existing proposals Based on the appealing characteristics of subordinated debt, many observers have called for requiring banking organizations to issue subordinated debt and some have also called for frequent issuance of such debt. Requiring banking organizations to issue subordinated debt frequently would force them to issue risk-sensitive debt, rather than insured deposits, when their risk has increased. Without such a requirement, there is empirical evidence that risky banks tend to shift their funding sources toward insured deposits and away from risk-sensitive securities. This evidence provides important motivation for a policy that would require a banking organization to regularly issue subordinated debt. In short, mandatory and regular subordinated debt issuance would weaken a banking organization’s ability to shield itself from direct market discipline. Existing proposals for mandatory subordinated debt typically share three common elements: first, that organizations be required to issue subordinated debt; second, that the subordinated debt be held by independent third parties; and third, that the bank have total subordinated debt outstanding in excess of 2 percent of its risk-weighted assets. There are, however, a number of other practical issues that have to be considered in designing an operational mandatory subordinated debt policy and many of these involve important tradeoffs that have to be weighed in deciding how to proceed. Practical considerations and details of a mandatory subordinated debt proposal Only large banks? Some proposals would only require large banks to issue subordinated debt. This approach is consistent with a theme I have been emphasizing, the importance of differentiation in regulatory standards and supervisory practice between the largest, most complex and internationally active banks and all others. As we begin to think of reform of the capital standards, for example, I expect we will move to a bifurcated approach in the United States, applying the revised Basle Accord only to large, complex, and internationally active banks and designing a simpler, less burdensome approach for the overwhelming majority of U.S. banks. In our supervisory program, the Federal Reserve is already focusing increased attention on a small number of large, complex domestic and foreign-owned banking organizations. It is sensible that any effort to enhance market discipline should also be focused on those banks. Several arguments can be advanced that suggest that a policy focused on large banks would get the most bang for the regulatory buck. Such banks hold the most significant systemic risk potential, and most of the banking system’s assets are in such organizations. These are the institutions that have become larger, more varied in their services and practices, and more complex and, as a result, more difficult to supervise. It is also the case that subordinated debt issues of the largest banks are more likely to be large enough to ensure a liquid market for the instrument. Finally, large banking organizations are already voluntarily issuing a significant amount of subordinated debt, so that a mandatory policy could be introduced with minimal transition costs. Bank or bank holding company? Interestingly, the top fifty U.S. insured commercial banks on average already finance in excess of 2 percent of their risk-weighted assets with subordinated debt. 
Thus, many large banks already issue subordinated debt in the amounts stipulated in many of the existing subordinated debt proposals. This is not to say, however, that a 2 percent subordinated debt requirement would have no bite. Currently, most bank subordinated debt is held by the parent holding company and hence is not traded. Thus, requiring a bank to issue tradable debt would likely increase both direct and indirect discipline. Moreover, in the absence of such a requirement, risky banking organizations could shift into risk-insensitive deposits and, in effect, avoid market discipline. With the requirement, this option would be closed and riskier banks would have higher funding costs. Most of the largest bank holding companies already have 2 percent or more of their risk-weighted assets in subordinated debt, and, in this case, the debt is publicly traded. While such issuance is voluntary, these organizations typically come to the market to issue subordinated debt at least once or twice a year. Most subordinated debt proposals focus on banks, however, and there are strong public policy reasons for doing so. Insured commercial banks have direct access to the federal safety net, and thus, banks are where the dangers of moral hazard and the consequent risks to the taxpayer are concentrated. The commercial bank is the primary concern of supervision and regulation, and it is where supervisors most need the market’s help. It follows that a subordinated debt policy should be focused on banks and not on their parent or affiliate organizations. In addition, subordinated debt issued at the bank level can provide increased protection for the deposit insurance fund. And, a policy focused on banks would reinforce the regulatory philosophy that the safety net and associated policies are limited to just commercial banks. How frequently should debt issuance be required? In the design of a subordinated debt policy, one also needs to analyze what frequency of issuance would be required. On the one hand, frequent issuance could improve the quality of the signals provided by spreads of subordinated debt in the secondary market, because the issuance process generally involves increased disclosure. This boost in the information content of secondary prices may be particularly important during periods of financial stress. More generally, frequent renewal of the information content of secondary prices may be highly beneficial as financial and technological innovations allow banking organizations to change their financial condition rapidly. Frequent issuance may also result in lower spreads as the market’s familiarity with the issuers increases. This would, of course, reduce the cost of a subordinated debt requirement. On the other hand, a lower required frequency of issuance may allow banks to signal their financial condition through their timing of issuance. Flexibility with respect to issuance may also allow banks to avoid the unnecessary cost of issuing subordinated debt during periods in which the bond market is turbulent. On balance, a mandated frequency of once or twice a year would seem reasonable, and would be in line with current practice for larger banking organizations. Should subordinated debt with standardized characteristics be required? There are also tradeoffs associated with requiring banking organizations to issue a standardized debt instrument with the same maturity, option characteristics, and covenants. The benefit of standardized debt is fairly obvious. 
It makes it easier for market participants to decipher the signals of a banking organization’s condition. The costs are also pretty clear. A standardized debt instrument could be more costly for some banks to issue than for others because bank capital structures differ across organizations. And, a standardized debt instrument may be very costly during certain market conditions. For example, in periods of actual or expected interest rate volatility, spreads on debt without put options may be relatively high. I would expect that the benefits of standardization in ensuring a purer signal about the relative risk of different banking institutions would outweigh the costs associated with such a restriction. Should put options be required? Some proposals have advocated that the required subordinated debt have put options. These options have been suggested for two reasons. First, they would provide debt holders with a powerful tool for increasing the cost of bank risk-taking. With a put option, debt holders would be able to force early repayment of debt when a bank changed its risk profile. Second, under some proposals, put options take the closure decision out of the hands of the regulators and place it in the hands of the debt holders. Not coincidentally, these proposals arose in the wake of the savings and loan crisis, during which regulators were criticized for their forbearance. Put options may also increase indirect discipline if they trigger supervisory actions. As disciplinary as they may be, there are strong arguments against the inclusion of put options. First, the exercise of put options can be extremely draconian, inducing liquidations and possibly premature closures. Second, the high correlation of risks across banks may induce a simultaneous exercise of put options, which could exacerbate or even trigger a systemic crisis. Should there be a cap on the rate paid at issuance for subordinated debt? Other proposals have advocated that a subordinated debt policy should impose maximum caps on rates or spreads over Treasuries with comparable maturities. The primary appeal of such an approach is that direct market discipline would be relatively strong under a rate cap. Banking organizations unable to issue under such a cap would be forced to lower their riskiness by shrinking their assets or by changing their asset mix. A cap could also be used to strengthen indirect discipline. A banking organization’s inability to issue subordinated debt under the cap would send a “red flag” to the market. Alternatively, the cap could be used to trigger supervisory action in the same way that a banking organization’s capital ratios currently trigger prompt corrective action. The downside of a cap is that it would be difficult, perhaps in practice impossible, to determine the optimal rate or spread that should serve as a cap, particularly since the optimal cap would vary with bond market and macroeconomic conditions. A fixed cap might harshly punish all banking organizations unnecessarily when the bond market is highly illiquid. A fixed cap might also be highly procyclical. Banking organizations would be forced to shrink, change their asset mix, or face supervisory discipline during downturns because spreads would be more likely to run into a fixed cap at such times. While some procyclical effects of market discipline are unavoidable, a fixed rate cap may make a market discipline policy so severely procyclical as to be undesirable from a macroeconomic perspective. 
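Reduced to stylized arithmetic, a rule combining the two quantitative elements most often proposed, a floor on the amount outstanding and a cap on the issuance spread, might be sketched as follows. The 2 percent floor matches the proposals described above; the 250-basis-point cap and the yields are purely hypothetical.

# Stylized subordinated debt rule: a floor on amounts outstanding plus a
# cap on the issuance spread over comparable-maturity Treasuries. The 2%
# floor follows the proposals discussed above; the cap level is invented.
SUBDEBT_FLOOR = 0.02   # minimum sub-debt as a share of risk-weighted assets
SPREAD_CAP_BP = 250    # hypothetical cap, in basis points

def check_bank(sub_debt, rwa, offered_yield_pct, treasury_yield_pct):
    floor_met = sub_debt / rwa >= SUBDEBT_FLOOR
    spread_bp = (offered_yield_pct - treasury_yield_pct) * 100
    return floor_met, spread_bp <= SPREAD_CAP_BP, spread_bp

floor_met, under_cap, spread = check_bank(
    sub_debt=1.9, rwa=80.0, offered_yield_pct=8.9, treasury_yield_pct=6.0)
print(f"Floor met: {floor_met}; spread: {spread:.0f} bp; under cap: {under_cap}")
# A bank priced out by the cap cannot issue: the "red flag" (and possible
# supervisory trigger) described above.

Even this simple sketch makes the procyclicality concern visible: a general widening of spreads in a downturn pushes every bank toward the fixed cap, whatever its individual condition.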
As one considers the various features that have been recommended in the existing subordinated debt proposals, it is important to keep in mind that there are strong reasons to stay closely aligned with current market practices and conventions. Capitalizing on such conventions could, of course, reduce the potential costs of a subordinated debt policy. And, at the same time, a subordinated debt policy aligned with such conventions could be very effective. Given the current deep and liquid markets for subordinated debt, such a policy would likely improve the information content of secondary market debt spreads. These spreads would facilitate an increase in indirect market discipline. Questions about the value of a subordinated debt requirement It is important to recognize that the costs and benefits of a subordinated debt policy – even one tailored to current market conventions – would vary over time. Subordinated debt in times of stress. A mandatory subordinated debt requirement would likely be most costly to banking organizations when the markets are under stress, the economy is deteriorating, or the bank itself is in financial difficulty. During these periods, the cost of issuing risk-sensitive securities would likely increase, and, at such times, forced issuance of subordinated debt would be particularly costly. I believe proposals that increase market discipline inevitably risk aggravating instability in times of overall stress. Eliminating deposit insurance, for example, would have the same qualitative outcome. The key in designing approaches to enhance market discipline is therefore to ensure a favorable tradeoff – sufficiently better controlled risk-taking in good times and bad times relative to somewhat aggravated risks during periods of overall stress. Subordinated debt, with its relatively long maturities and therefore limited ability to “run,” appears to offer such a favorable tradeoff. The cost of reduced funding flexibility. Another major cost of a subordinated debt proposal would be the reduced flexibility in financing, resulting in a somewhat higher cost of financing than would otherwise be available. These higher costs may also vary with business conditions, market conditions, and banking conditions. One of the ironies of a subordinated debt proposal is that it suggests that additional regulation is required to induce additional market discipline. Regulations are never costless, so we must ask what additional costs might be imposed as a consequence of the mandate. Of course, to the extent that the proposal follows existing market conventions – in terms of the amount, frequency, etc. – the incremental costs are limited, though, of course, so are the benefits. It is therefore important to be satisfied that the benefits outweigh the costs. More research is important A subordinated debt proposal is, in my judgment, promising and intriguing. Still, there remain questions to be answered. To move in the direction of answers, the Federal Reserve is working to improve the data it has available on the market price of subordinated debt issued by banks and bank holding companies, as well as other market data that could be useful in signaling changes in the risk profiles of banking organizations. 
We will be evaluating the degree to which prices of market instruments track the changing risk profiles of banking organizations, assessing the usefulness of such market signals in the surveillance of the financial condition of large, complex banking organizations, and gauging the potential usefulness of such data in the supervisory process. We believe that before we seriously consider imposing a mandate related to subordinated debt, we should carefully study how the existing market functions and the degree to which current practices may already be fulfilling many of the objectives of a mandatory system. In addition, we will be focusing increased research effort on topics related to market discipline in general and subordinated debt in particular. We must get a reasonable estimate of how much additional market discipline would be imposed by forced issuance of risk-sensitive debt. How much more effective would subordinated debt holders be than uninsured deposit holders when they raise funding costs for a risky bank? Does the strength of penalties associated with bank debt issuance vary systematically across bank liabilities or with the business cycle? This research may also help us understand how to strike a balance between supervision, regulation, and market discipline in order to most effectively achieve the safety and soundness of our financial system. Conclusion I hope my review of the difficulties and challenges associated with developing an operationally feasible market discipline policy has not been disheartening. Such has not been my intention. Rather, I have sought to realistically review the practical issues and tradeoffs that need to be resolved. When all is said and done, however, it seems clear that market discipline remains our first line of defense. It is perhaps the most flexible option for maintaining bank safety and soundness in a rapidly evolving environment and has the potential to strengthen and complement bank supervision and regulation, particularly on the outside chance that the market knows best. While I believe that more research is needed to make the case for a policy to enhance market discipline through subordinated debt, and to pin down the design features of a specific policy for such instruments, we should not ignore the abundant evidence that highlights the promise of market discipline in general and – perhaps – subordinated debt in particular.
|
board of governors of the federal reserve system
| 1999 | 6 |
Remarks by Mr Roger W Ferguson Jr, a member of the Board of Governors of the US Federal Reserve System and Chairman of the Joint Year 2000 Council, at the Second Global Y2K National Coordinators Conference held at the United Nations in New York on 22 June 1999.
|
Mr Ferguson looks at public information and confidence as concerns posed by the Year 2000 problem Remarks by Mr Roger W Ferguson Jr, a member of the Board of Governors of the US Federal Reserve System and Chairman of the Joint Year 2000 Council, at the Second Global Y2K National Coordinators Conference held at the United Nations in New York on 22 June 1999. Public Information and Public Confidence In Year 2000 Year 2000 is a unique problem that has various dimensions. It started as a technical problem, has progressed to being a senior management business issue, and is now becoming a public confidence concern. In addition, it is unique in that we all know that the century date change will occur, but what will occur on that date is still open to differing perspectives. This is certainly a case in which the future is opaque. In this environment, there are, and will increasingly be, a variety of views and analyses. Some of them will be fact-based and responsible; others will perhaps be less analytical and more emotional. Why is Public Disclosure Important? Public disclosure of each business’s plans, and indeed of country plans, for remediation, testing and renovation of systems, as well as of general outlines for contingency plans, is critical. This is axiomatic, but it might be helpful to explain why. In a business context, individual counterparties, i.e. suppliers, customers, creditors and others, are all working to develop business continuity plans. Full and comprehensive information is essential to that process. The readiness of counterparties is being evaluated based on available information. Inadequate details in a business environment can well lead to negative perceptions in the marketplace, which may have pronounced market and/or economic consequences. Not surprisingly, in a number of instances, earlier pessimistic perceptions have been positively revised with greater disclosure of information. With respect to public confidence, Year 2000 seems to be an issue in which a better-informed public is a more confident and calm public. For example, in the United States the financial regulators have recently completed a survey of public perceptions on this matter. The results show that those with the greatest exposure to the Year 2000 issue were more likely to believe that any problems that emerge will be short-lived and subject to repair. More broadly in the context of public confidence, we know that the public will become aware of and develop a heightened interest in Year 2000 at differing times. In an environment in which public perceptions emerge slowly, I believe that it is important to maintain a steady flow of accurate information to the public domain. As the public gradually awakens to this topic, a stream of balanced, factually accurate and timely information can potentially counter the more sensationalist coverage that is surely going to emerge. In addition, it is important for information on Year 2000 to be placed in the proper context and in perspective. Accurate information from reputable sources will be important in providing that context. Finally, public relations experts indicate that public perceptions will firm at some point, and be less subject to revision. Since we do not know when that point will occur, it is important for accurate information to be always available. The frequency and volume of the more balanced messages may have to increase as we all get closer to the date change, but information sharing should not wait until the last moment. What Has Been Done? 
Fortunately, those who have spent time over the last several years considering the Year 2000 problem are aware of the need for information sharing to maintain business and public confidence. The Joint Year 2000 Council in January 1999 issued a Bulletin to the world’s financial regulators entitled “Focus on Information Sharing”, which called for enhanced disclosure and information sharing and summarized the work of a guidance paper on the same topic. Earlier this month we reinforced this message with a short advisory to financial regulators worldwide. The Global 2000 Co-ordinating Group, a committee of banks, securities firms, and insurance companies from throughout the world, provides a tool on its web site for financial institutions to disclose their Year 2000 plans and status. Global 2000 is a member of the Joint Council’s External Consultative Committee and we endorse its disclosure efforts. In the United States, the financial regulators have issued guidance to all depository institutions that includes a requirement for them to have public awareness efforts underway to inform their customers of the steps being taken to prepare for the century date change. Similarly, regulators have reminded financial institutions that they should understand the degree of preparedness of their major customers. Finally, our Securities and Exchange Commission has mandated disclosure from all companies with public shareholders. It is important for us to understand that the information sharing being called for does not require guarantees of complete preparedness. Rather, honest disclosure of plans and progress toward completion of those plans is what is called for. What Should You Do? Given what I have said, it is clear that we each have a role to play in this unfolding event. If you represent a national Year 2000 authority, as most do, your role is twofold. First, I believe that you should encourage honest disclosure from government entities on their degree of preparedness and the general outline of contingency plans that they are making. Also, regulators should use the weight of regulatory control to encourage regulated entities to engage in voluntary self-disclosure. If you represent the media, you have a special obligation to engage in balanced and accurate reportage, not attempting to hide important facts, but not seeking the most sensational coverage. A broad synthesis that captures facts, not drama, is probably most useful. Finally, all of us as members of the public have an obligation to listen to the most responsible voices. We should recall that in times of uncertainty, many views are expressed; only the most factually accurate deserve our attention.
|
board of governors of the federal reserve system
| 1999 | 6 |
Testimony of the Chairman of the Board of Governors of the US Federal Reserve System, Mr Alan Greenspan, before the Joint Economic Committee of the US Congress on 17 June 1999.
|
Mr Greenspan’s testimony on monetary policy and the economic outlook in the United States Testimony of the Chairman of the Board of Governors of the US Federal Reserve System, Mr Alan Greenspan, before the Joint Economic Committee of the US Congress on 17 June 1999. As emphasized by the important hearings this committee has held in the past few days, an impressive proliferation of new technologies is inducing major shifts in the underlying structure of the American economy. These fundamental changes appear to be far from complete. The way America does business, including the interaction among the various economic players in our economy, is in the midst of a significant transformation, though the pace of change is unclear. As a consequence, many of the empirical regularities depicting the complex of economic relationships on which policymakers rely have been markedly altered. The Federal Reserve has thus been pressed to continuously update our understanding of how the newer forces are developing in order for us to address appropriately our underlying monetary policy objective: maximum sustainable economic growth. The failure of economic models based on history to anticipate the acceleration in productivity contributed to the recent persistent underprediction of economic growth and overprediction of inflation. Guiding policy by those models doubtless would have unduly inhibited what has been a remarkable run of economic prosperity. And yet, while we have been adjusting the implicit models of the underlying economic forces on which we base our decisions, certain verities remain. Importantly, the evidence has become increasingly persuasive that relatively stable prices – neither persistently rising nor falling – are more predictable and hence result in a lower risk premium for investment. Because the nation’s level of investment, to a large extent, determines our prosperity over time, stability in the general level of prices for goods and services is clearly a necessary condition for maximum sustainable growth. However, product price stability does not guarantee either the maintenance of financial market stability or maximum sustainable growth. As recent experience attests, a prolonged period of price stability does help to foster economic prosperity. But, as we have also observed over recent years, as have others in times past, such a benign economic environment can induce investors to take on more risk and drive asset prices to unsustainable levels. This can occur when investors implicitly project rising prosperity further into the future than can reasonably be supported. By 1997, for example, measures of risk had fallen to historic lows as businesspeople, having experienced years of continuous good times, assumed, not unreasonably, that the most likely forecast was more of the same. The Asian crisis, and especially the Russian devaluation and debt moratorium of August 1998, brought the inevitable rude awakening. In the ensuing weeks, financial markets in the United States virtually seized up, risk premiums soared, and for a period sellers of even investment-grade bonds had difficulty finding buyers. The Federal Reserve responded with a three-step reduction in the federal funds rate totaling 75 basis points. Market strains receded – whether as a consequence of our actions or of other forces – and yield spreads have since fallen but not all the way back to their unduly thin levels of last summer. 
The American economy has retained its momentum and emerging economies in Asia and Latin America are clearly on firmer footing, though in some cases their turnarounds appear fragile. The recovery of financial markets, viewed in isolation, would have suggested that at least part of the emergency injection of liquidity, and the associated 75-basis-point decline in the funds rate, had ceased to be necessary. But, with wage growth and price inflation declining by a number of measures earlier this year, and productivity evidently still accelerating – thereby keeping inflation in check – we chose to maintain the lower level of the funds rate. While this stellar noninflationary economic expansion still appears remarkably stress-free on the surface, there are developing imbalances that give us pause and raise the question: Do these imbalances place our economic expansion at risk? For the period immediately ahead, inflationary pressures still seem well contained. To be sure, oil prices have nearly doubled and some other commodity prices have firmed, but large productivity gains have held unit cost increases to negligible levels. Pricing power is still generally reported to be virtually nonexistent. Moreover, the re-emergence of rising profit margins, after severe problems last fall, indicates cost pressures on prices remain small. Nonetheless, the persistence of certain imbalances poses a risk to the longer-run outlook. Strong demand for labor has continued to reduce the pool of available workers. The percentage of the relevant population who are not at work but would like a job is near the low of this series, which began in 1970. Despite its extraordinary acceleration, labor productivity has not grown fast enough to accommodate the increased demand for labor induced by the exceptional strength in demand for goods and services. Overall economic growth during the past three years has averaged four percent annually, of which roughly two percentage points reflected increased productivity and about one point the growth in our working-age population. The remainder was drawn from the ever-decreasing pool of available job seekers without work. That last development represents an unsustainable trend that has been produced by an inclination of households and firms to increase their spending on goods and services beyond the gains in their income from production. That propensity to spend, in turn, has been spurred by the rise in equity and home prices, which our analysis suggests can account for at least one percentage point of GDP growth over the past three years. Even if this period of rapid expansion of capital gains comes to an end shortly, there remains a substantial amount in the pipeline to support outsized increases in consumption for many months into the future. Of course, a dramatic contraction in equity market prices would greatly reduce this backlog of extra spending. To be sure, labor market tightness has not, as yet, put the current expansion at risk. Despite the ever-shrinking pool of available labor, recent readings on year-over-year increases in labor compensation have held steady or, by some measures, even eased. This seems to have resulted in part from falling inflation, which has implied that relatively modest nominal wage gains have provided healthy increases in purchasing power. Also, a residual fear of job skill obsolescence, which has induced a preference for job security over wage gains, probably is still holding down wage levels. 
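The growth decomposition cited above can be checked with simple arithmetic; the sketch below merely restates the testimony’s rounded figures as a back-of-the-envelope calculation.

```python
# Back-of-the-envelope version of the decomposition in the testimony:
# GDP growth ~ productivity growth + working-age population growth
#              + drawdown of the pool of available workers.

gdp_growth = 4.0      # average annual real GDP growth, percent
productivity = 2.0    # contribution of output per hour, percentage points
population = 1.0      # contribution of working-age population growth

pool_drawdown = gdp_growth - productivity - population
print(f"Drawn from the shrinking pool of available workers: {pool_drawdown:.1f} pp/year")
# About 1 percentage point a year -- which is why the testimony calls this
# component unsustainable: the pool cannot shrink indefinitely.
```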
But should labor markets continue to tighten, significant increases in wages, in excess of productivity growth, will inevitably emerge, absent the unlikely repeal of the law of supply and demand. Because monetary policy operates with a significant lag, we have to make judgments, not only about the current degree of balance in the economy, but about how the economy is likely to fare a year or more in the future under the current policy stance. The return of financial markets to greater stability and our growing concerns about emerging imbalances led the Federal Open Market Committee to adopt a policy position at our May meeting that contemplated a possible need for an upward adjustment of the federal funds rate in the months ahead. The issue is what policy setting has the capacity to sustain our remarkable economic expansion, now in its ninth year. This is the question the FOMC will be addressing at its meeting at the end of the month. One of the important issues for the FOMC as it has made such judgments in recent years has been the weight to place on asset prices. As I have already noted, history suggests that owing to the growing optimism that may develop with extended periods of economic expansion, asset values can climb to unsustainable levels even if product prices are relatively stable. The 1990s have witnessed one of the great bull stock markets in American history. Whether that means an unstable bubble has developed in its wake is difficult to assess. A large number of analysts have judged the level of equity prices to be excessive, even taking into account the rise in “fair value” resulting from the acceleration of productivity and the associated long-term corporate earnings outlook. But bubbles generally are perceptible only after the fact. To spot a bubble in advance requires a judgment that hundreds of thousands of informed investors have it all wrong. Betting against markets is usually precarious at best. While bubbles that burst are scarcely benign, the consequences need not be catastrophic for the economy. The bursting of the Japanese bubble a decade ago did not lead immediately to sharp contractions in output or a significant rise in unemployment. Arguably, it was the subsequent failure to address the damage to the financial system in a timely manner that caused Japan’s current economic problems. Likewise, while the stock market crash of 1929 was destabilizing, most analysts attribute the Great Depression to ensuing failures of policy. And certainly the crash of October 1987 left little lasting imprint on the American economy. This all leads to the conclusion that monetary policy is best focused primarily on stability of the general level of prices of goods and services as the most credible means to achieve sustainable economic growth. Should volatile asset prices cause problems, policy is probably best positioned to address the consequences when the economy is working from a base of stable product prices. For monetary policy to foster maximum sustainable economic growth, it is useful to pre-empt forces of imbalance before they threaten economic stability. But this may not always be possible – the future at times can be too opaque to penetrate. When we can be pre-emptive we should be, because modest pre-emptive actions can obviate the need for more drastic actions at a later date that could destabilize the economy. The economic expansion has generated many benefits. It has been a major factor in rebalancing our federal budget. 
But more important, a broad majority of our people have moved to a higher standard of living, and we have managed to bring into the productive workforce those who have too long been at its periphery. This has enabled large segments of our society to gain skills on the job and the self-esteem associated with work. Our responsibility, at the Federal Reserve and in Congress, is to create the conditions most likely to preserve and extend the expansion. Should the economic expansion continue into February of next year, it will have become the longest in America’s economic annals. Someday, of course, the expansion will end; human nature has exhibited a tendency to excess through the generations with the inevitable economic hangover. There is nothing in our economic data series to suggest that this propensity has changed. It is the job of economic policymakers to mitigate the fallout when it occurs, and, hopefully, ease the transition to the next expansion.
|
board of governors of the federal reserve system
| 1999 | 7 |
Testimony by the Chairman of the Board of Governors of the US Federal Reserve System, Mr Alan Greenspan, before the Committee on Banking and Financial Services of the US House of Representatives on 22 July 1999 (July 1999 Humphrey-Hawkins report).
|
Mr Greenspan presents the US Federal Reserve’s semiannual report on monetary policy Testimony by the Chairman of the Board of Governors of the US Federal Reserve System, Mr Alan Greenspan, before the Committee on Banking and Financial Services of the US House of Representatives on 22 July 1999 (July 1999 Humphrey-Hawkins report). Thank you, Mr. Chairman and other members of the Committee, for this opportunity to present the Federal Reserve’s semiannual report on monetary policy. To date, 1999 has been an exceptional year for the American economy, but a challenging one for American monetary policy. Through the first six months of this year, the U.S. economy has further extended its remarkable performance: Almost 1¼ million jobs were added to payrolls on net, and gross domestic product apparently expanded at a brisk pace, perhaps near that of the prior three years. At the root of this impressive expansion of economic activity has been a marked acceleration in the productivity of our nation’s workforce. This productivity growth has allowed further healthy advances in real wages and has permitted activity to expand at a robust clip while helping to foster price stability. Last fall, the Federal Open Market Committee (FOMC) eased monetary policy to counter a seizing-up of financial markets that threatened to disrupt economic activity significantly. As those markets recovered, the FOMC had to assess whether that policy stance remained appropriate. By late last month, when it became apparent that much of the financial strain of last fall had eased, that foreign economies were firming, and that demand in the United States was growing at an unsustainable pace, the FOMC raised its intended federal funds rate ¼ percentage point, to 5%. To have refrained from doing so in our judgment would have put the U.S. economy’s expansion at risk. If nothing else, the experience of the last decade has reinforced earlier evidence that a necessary condition for maximum sustainable economic growth is price stability. While product prices have remained remarkably restrained in the face of exceptionally strong demand and expanding potential supply, it is imperative that we do not become complacent. The already shrunken pool of job-seekers and considerable strength of aggregate demand suggest that the Federal Reserve will need to be especially alert to inflation risks. Should productivity fail to continue to accelerate and demand growth persist or strengthen, the economy could overheat. That would engender inflationary pressures and put the sustainability of this unprecedented period of remarkable growth in jeopardy. One indication that inflation risks were rising would be a tendency for labor markets to tighten further. But the FOMC also needs to continue to assess whether the existing degree of pressure in these markets is consistent with sustaining our low-inflation environment. If new data suggest it is likely that the pace of cost and price increases will be picking up, the Federal Reserve will have to act promptly and forcefully so as to preclude imbalances from arising that would only require a more disruptive adjustment later – one that could impair the expansion and bring into question whether the many gains already made can be sustained. Recent Developments A number of important forces have been shaping recent developments in the U.S. economy. One has been a recovery of financial markets from the disruptions of last fall. 
By the end of 1998, the extreme withdrawal from risk-taking and consequent seizing-up of markets had largely dissipated. This year, risk spreads have narrowed further – though generally not to the unrealistically low levels of a year ago – and a heavy volume of issuance in credit markets has signaled a return to their more-normal functioning. Equity prices have risen to new highs and, in the process, have elevated price-earnings ratios to historic levels. Abroad, many financial markets and economies also have improved. Brazil weathered a depreciation of its currency with limited fallout on its neighbors. In Asia, a number of the emerging market economies seemed to be reviving after the trying adjustments of the previous year or so. Progress has not been universal, and in many economies prospects remain clouded, depending importantly on the persistence of efforts to make fundamental reforms whose necessity had been made so painfully obvious in the crises those economies endured. Nonetheless, the risks of further major disruptions to financial and trade flows that had concerned the FOMC when it eased policy last fall have clearly diminished. Improving global prospects also mean that the U.S. economy will no longer be experiencing declines in basic commodity and import prices that held down inflation in recent years. In the domestic economy, data becoming available this year have tended to confirm that productivity growth has stepped up. It is this acceleration of productivity over recent years that has explained much of the surprising combination of a slowing in inflation and sustained rapid real growth. Increased labor productivity has directly limited the rise of unit labor costs and accordingly damped pressures on prices. This good inflation performance, reinforced also by falling import prices, in turn has fostered further declines in inflation expectations over recent years that bode well for pressures on costs and prices going forward. In testimony before this committee several years ago, I raised the possibility that we were entering a period of technological innovation that occurs perhaps once every fifty or one-hundred years. The evidence then was only marginal and inconclusive. Of course, tremendous advances in computing and telecommunications were apparent, but their translations into improved overall economic efficiency and rising national productivity were conjectural at best. While the growth of output per hour had shown some signs of quickening, the normal variations exhibited by such data in the past were quite large. More intriguing was the remarkable surge in capital investment after 1993, especially in high-tech goods, a full two years after a general recovery was under way. This suggested a marked increase in the perceived prospective rates of return on the newer technologies. That American productivity growth has picked up over the past five years or so has become increasingly evident. Nonfarm business productivity (on a methodologically consistent basis) grew at an average rate of a bit over 1% per year in the 1980s. In recent years, productivity growth has picked up to more than 2%, with the past year averaging about 2½%. To gauge the potential for similar, if not larger, gains in productivity going forward, we need to attempt to arrive at some understanding of what has occurred to date. A good deal of the acceleration in output per hour has reflected the sizable increase in the stock of labor-saving equipment. But that is not the whole story. 
Output has grown beyond what normally would have been expected from increased inputs of labor and capital alone. Business restructuring and the synergies of the new technologies have enhanced productive efficiencies. American industry quite generally – not solely the newer industries at the cutting edge of innovation – has achieved improved efficiency and cost containment through high-tech capital investment. Our century-old motor vehicle industry, for example, has raised output per hour by a dramatic 4½% annually on average in the past two years, compared with a lackluster 1¼% on average earlier this decade. Much the same is true of many other mature industries, such as steel, textiles, and other stalwarts of an earlier age. This has confirmed the earlier indications of an underlying improvement in rates of return on the newer technologies and their profitable synergies with the existing capital stock. These developments have created a broad range of potential innovations that have granted firms greater ability to profitably displace costly factors of production whenever profit margins have been threatened. Moreover, the accelerating use of newer technologies has markedly enhanced the flexibility of our productive facilities. It has dramatically reduced the lead times on the acquisition of new equipment and enabled firms to adjust quickly to changing market demands. This has indirectly increased productive capacity and effectively, at least for now, eliminated production bottlenecks and the shortages and price pressures they inevitably breed. This greater ability to pare costs, increase production flexibility, and expand capacity is arguably the major reason why inflationary pressures have been held in check in recent years. Others have included the one-time fall in the prices of oil, other commodities, and imports more generally. In addition, a breaking down of barriers to cross-border trade, owing both to the new technologies and to the reduction of government restrictions on trade, has intensified the pressures of competition, helping to contain prices. Coupled with the decline in military spending worldwide, this has freed up resources for more productive endeavors, especially in a number of previously nonmarket economies. More generally, the consequent erosion of pricing power has imparted an important imperative to hold down costs. The availability of new technology to each company and its rivals affords both the opportunity and the competitive necessity of taking steps to reduce costs, which translates on a consolidated basis into increased national productivity. The acceleration in productivity owes importantly to new information technologies. Prior to this IT revolution, most of twentieth-century business decisionmaking had been hampered by limited information. Owing to the paucity of timely knowledge of customers’ needs, the location of inventories, and the status of material flows throughout complex production systems, businesses built in substantial redundancies. Doubling up on materials and staffing was essential as a cushion against the inevitable misjudgments made in real time when decisions were based on information that was hours, days, or even weeks old. 
While businesspeople must still operate in an uncertain world, the recent years’ remarkable surge in the availability of real-time information has enabled them to remove large swaths of inventory safety stocks, redundant capital equipment, and layers of workers, while arming them with detailed data to fine-tune specifications to most individual customer needs. Despite the remarkable progress witnessed to date, history counsels us to be quite modest about our ability to project the future path and pace of technology and its implications for productivity and economic growth. We must remember that the pickup in productivity is relatively recent, and a key question is whether that growth will persist at a high rate, drop back toward the slower standard of much of the last twenty-five years, or climb even more. By the last I do not just mean that productivity will continue to grow, but that it will grow at an increasingly faster pace through a continuation of the process that has so successfully contained inflation and supported economic growth in recent years. The business and financial community does not as yet appear to sense a pending flattening in this process of increasing productivity growth. This is certainly the widespread impression imparted by corporate executives. And it is further evidenced by the earnings forecasts of more than a thousand securities analysts who regularly follow S&P 500 companies on a firm-by-firm basis, which presumably embody what corporate executives are telling them. While the level of these estimates is no doubt upwardly biased, unless these biases have significantly changed over time, the revisions of these estimates should be suggestive of changes in underlying economic forces. Except for a short hiatus in the latter part of 1998, analysts’ expectations of five-year earnings growth have been revised up continually since early 1995. If anything, the pace of those upward revisions has quickened of late. True, some of that may reflect a pickup in expected earnings of foreign affiliates, especially in Europe, Japan, and the rest of Asia. But most of this year’s increase almost surely owes to domestic influences. There are only a limited number of ways that the expected long-term growth of domestic profits can increase, and some we can reasonably rule out. There is little evidence that company executives or security analysts have significantly changed their views in recent months of the longer-term outlook for continued price containment, the share of profits relative to wages, or anticipated growth of hours worked. Rather, analysts and the company executives they talk to appear to be expecting that unit costs will be held in check, or even lowered, as sales expand. Hence, implicit in upward revisions of their forecasts, when consolidated, is higher expected national productivity growth. Independent data on costs and prices in recent years tend to confirm what aggregate data on output and hours worked indicate: that productivity growth has risen. With price inflation stable and domestic operating profit margins rising, the rate of increase in total consolidated unit costs must have been falling. Even taking into account the evidence of declining unit interest costs of nonfinancial corporations, unit labor cost increases (which constitute three quarters of total unit costs) must also be slowing. 
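The chain of inference here, and in the sentence that follows, rests on the identity that unit labor costs equal compensation per hour divided by output per hour; the small numerical sketch below, with figures invented for illustration rather than taken from the testimony, shows why slowing unit labor costs alongside rising compensation imply accelerating productivity.

```python
# Unit labor cost (ULC) identity: ULC = compensation per hour / output per hour,
# so approximately, in growth rates:
#   ulc_growth ~ compensation_growth - productivity_growth.
# The figures below are illustrative only.

def ulc_growth(compensation_growth, productivity_growth):
    return compensation_growth - productivity_growth

earlier = ulc_growth(compensation_growth=3.0, productivity_growth=1.0)  # 2.0%
recent = ulc_growth(compensation_growth=3.5, productivity_growth=2.5)   # 1.0%

# Compensation growth edged up, yet ULC growth fell -- possible only because
# productivity growth rose by more. That is the inference drawn in the text.
print(f"Earlier ULC growth: {earlier:.1f}%; recent ULC growth: {recent:.1f}%")
```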
Because until very recently growth of compensation per hour has been rising, albeit modestly, it follows that productivity growth must have been rising these past five years, as well. Accelerating productivity is thus evident in underlying consolidated income statements of nonfinancial corporations, as well as in our more direct, though doubtless partly flawed, measures of output and input. That said, we must also understand the limits to this process of productivity-driven growth. To be sure, the recent acceleration in productivity has provided an offset to our taut labor markets by holding unit costs in check and by adding to the competitive pressures that have contained prices. But once output-per-hour growth stabilizes, even if at a higher rate, any pickup in the growth of nominal compensation per hour will translate directly into a more-rapid rate of increase in unit labor costs, heightening the pressure on firms to raise the prices of the goods and services they sell. Thus, should the increments of gains in technology that have fostered productivity slow, any extant pressures in the labor market should ultimately show through to product prices. Meanwhile, though, the impressive productivity growth of recent years also has had important implications for the growth of aggregate demand. If productivity is driving up real incomes and profits – and, hence, gross domestic income – then gross domestic product must mirror this rise with some combination of higher sales of motor vehicles, other consumer goods, new homes, capital equipment, and net exports. By themselves, surges in economic growth are not necessarily unsustainable – provided they do not exceed the sum of the rate of growth in the labor force and productivity for a protracted period. However, when productivity is accelerating, it is very difficult to gauge when an economy is in the process of overheating. In such circumstances, assessing conditions in the labor market can be helpful in forming those judgments. Employment growth has exceeded the growth in working-age population this past year by almost ½ percentage point. While somewhat less than the spread between these growth rates over much of the past few years, this excess is still large enough to continue the further tightening of labor markets. It implies that real GDP is growing faster than its potential. To an important extent, this excess of the growth of demand over supply owes to the wealth effect as consumers increasingly perceive their capital gains in the stock and housing markets as permanent and, evidently as a consequence, spend part of them, an issue to which I shall return shortly. There can be little doubt that, if the pool of job seekers shrinks sufficiently, upward pressures on wage costs are inevitable, short – as I have put it previously – of a repeal of the law of supply and demand. Such cost increases have invariably presaged rising inflation in the past, and presumably would in the future, which would threaten the economic expansion. By themselves, neither rising wages nor swelling employment rolls pose a risk to sustained economic growth. Indeed, the Federal Reserve welcomes such developments and has attempted to gauge its policy in recent years to allow the economy to realize its full, enhanced potential. In doing so, we must remain concerned with evolving shorter-run imbalances that can constrain long-run economic expansion and job growth. 
With productivity growth boosting both aggregate demand and aggregate supply, the implications for the real market interest rates that are consistent with sustainable economic growth are not obvious. In fact, current real rates, although somewhat high by historical standards, have been consistent with continuing rapid growth in an environment where, as a consequence of greater productivity growth, capital gains and high returns on investment give both households and businesses enhanced incentives to spend. Other Considerations Even if labor supply and demand were in balance, however, other aspects of the economic environment may exhibit imbalances that could have important implications for future developments. For example, in recent years, as a number of analysts have pointed out, a significant shortfall has emerged in the private saving with which to finance domestic investment in plant and equipment and houses. One offset to the decline in household saving out of income has been a major shift of the federal budget to surplus. Of course, an important part of that budgetary improvement, in turn, owes to augmented revenues from capital gains and other taxes that have flowed from the rising market value of assets. Still, the budget surpluses have helped to hold down interest rates and facilitate private spending. The remaining gap between private saving and domestic investment has been filled by a sizable increase in saving invested from abroad, largely a consequence of the technologically driven marked increase in rates of return on U.S. investments. Moreover, in recent years, with many foreign economies faltering, U.S. investments have looked particularly attractive. As U.S. international indebtedness mounts, however, and foreign economies revive, capital inflows from abroad that enable domestic investment to exceed domestic saving may be difficult to sustain. Any resulting decline in demand for dollar assets could well be associated with higher market interest rates, unless domestic saving rebounds. Near-Term Outlook Going forward, the Members of the Federal Reserve Board and presidents of the Federal Reserve Banks believe there are mechanisms in place that should help to slow the growth of spending to a pace more consistent with that of potential output growth. Consumption growth should slow some, if, as seems most likely, outsized gains in share values are not repeated. In that event, businesses may trim their capital spending plans, a tendency that would be reinforced by the higher level of market interest rates that borrowers now face. But with large unexploited long-term profit opportunities stemming from still-burgeoning innovations and falling prices of many capital goods, the typical cyclical retrenchment could be muted. Working to offset somewhat this anticipated slowing of the growth of domestic demand, our export markets can be expected to be more buoyant because of the revival in growth in many of our important trading partners. After considering the various forces at work in the near term, most of the Federal Reserve governors and Bank presidents expect the growth rate of real GDP to be between 3½ and 3¾% over the four quarters of 1999 and 2½ to 3% in 2000. The unemployment rate is expected to remain in the range of the past eighteen months. Inflation, as measured by the four-quarter percent change in the consumer price index, is expected to be 2¼ to 2½% over the four quarters of this year. 
CPI increases thus far in 1999 have been greater than the average in 1998, but the governors and bank presidents do not anticipate a further pickup in inflation going forward. An abatement of the recent run-up in energy prices would contribute to such a pattern, but policymakers’ forecasts also reflect their determination to hold the line on inflation, through policy actions if necessary. The central tendency of their CPI inflation forecasts for 2000 is 2 to 2½%. Pre-emptive Policymaking In its deliberations this year, the FOMC has had to wrestle with the issue of what policy setting has the capacity to sustain this remarkable expansion, now in its ninth year. For monetary policy to foster maximum sustainable economic growth, it is useful to pre-empt forces of imbalance before they threaten economic stability. But this may not always be possible – the future at times can be too opaque to penetrate. When we can be pre-emptive, we should be, because modest pre-emptive actions can obviate more drastic actions at a later date that could destabilize the economy. I should emphasize that pre-emptive policymaking is equally applicable in both directions, as has been evident over the years in our inclination both to raise interest rates when the potential for inflationary pressures emerged, as in the spring of 1994, and to lower rates when the more palpable risk was economic weakness, as in the fall of last year. This even-handedness is necessary because emerging adverse trends may fall on either side of our long-run objective of price stability. Stable prices allow households and firms to concentrate their efforts on what they do best: consuming, producing, saving, and investing. A rapidly rising or a falling general price level would confound market signals and place strains on the system that ultimately may throttle economic expansion. In the face of uncertainty, the Federal Reserve at times has been willing to move policy based on an assessment that risks to the outlook were disproportionately skewed in one direction or the other, rather than on a firm conviction that, absent action, the economy would develop imbalances. For instance, both the modest policy tightening of the spring of 1997 and some portion of the easing of last fall could be viewed as insurance against potential adverse economic outcomes. As I have already indicated, by its June meeting the FOMC was of the view that the full extent of this insurance was no longer needed. It also did not believe that its recent modest tightening would put the risks of inflation going forward completely into balance. However, given the many uncertainties surrounding developments on both the supply and demand side of the economy, the FOMC did not want to foster the impression that it was committed in short order to tighten further. Rather, it judged that it would need to evaluate the incoming data for more signs that further imbalances were likely to develop. Pre-emptive policymaking requires that the Federal Reserve continually monitor economic conditions, update forecasts, and appraise the setting of its policy instrument. Equity prices figure importantly in that forecasting process because they influence aggregate demand. As I testified last month, the central bank cannot effectively target stock or other asset prices directly. Should an asset bubble arise, or even if one is already in train, monetary policy properly calibrated can doubtless mitigate at least part of the impact on the economy. 
And, obviously, if we could find a way to prevent or deflate emerging bubbles, we would be better off. But identifying a bubble in the process of inflating may be among the most formidable challenges confronting a central bank, pitting its own assessment of fundamentals against the combined judgment of millions of investors. By itself, the interpretation that we are currently enjoying productivity acceleration does not ensure that equity prices are not overextended. There can be little doubt that if the nation’s productivity growth has stepped up, the level of profits and their future potential would be elevated. That prospect has supported higher stock prices. The danger is that in these circumstances, an unwarranted, perhaps euphoric, extension of recent developments can drive equity prices to levels that are unsupportable even if risks in the future become relatively small. Such straying above fundamentals could create problems for our economy when the inevitable adjustment occurs. It is the job of economic policymakers to mitigate the fallout when it occurs and, hopefully, ease the transition to the next expansion. Century Date Change Preparations I would be remiss in this overview of near-term economic developments if I did not relay the ongoing efforts of the Federal Reserve, other financial regulators, and the private sector to come to grips with the rollover of their computer systems at the start of the upcoming century. While I have been in this business too long to promise that 2000 will open on an entirely trouble-free note, the efforts to address potential problems in the banking and financial system have been exhaustive. For our part, the Federal Reserve System has now completed remediation and testing of all its mission-critical applications, including testing its securities and funds-transfer systems with our thousands of financial institution customers. As we have said previously, while we do not believe consumers need to hold excess cash because we expect the full array of payment options to work, we have taken precautions to ensure that ample currency is available. Further, the Federal Reserve established a special liquidity facility at which sound depository institutions with good collateral can readily borrow at a slight penalty rate in the months surrounding the rollover. The availability of this back-stop funding should make depository institutions more willing to provide loans and lines of credit to other financial institutions and businesses and to meet any deposit withdrawals as this century closes. The banking industry is also working hard, and with evident success, to prepare for the event. By the end of May, 98% of the nation’s depository institutions examined by Federal Financial Institutions Examination Council agencies were making satisfactory progress on their Year 2000 preparations. The agencies are now in the process of examining supervised institutions for compliance with the June 30 milestone date for completing testing and implementation of remediated mission-critical systems. Supervisors also expect institutions to prepare business resumption contingency plans and to maintain open lines of communication with customers and counterparties about their own readiness. The few remaining laggards among financial institutions in Year 2000 preparedness have been targeted for additional follow-up and, as necessary, will be subject to formal enforcement actions. 
Conclusion As a result of our nation’s ongoing favorable economic performance, not only has the broad majority of our people moved to a higher standard of living, but a strong economy also has managed to bring into the productive workforce many who had for too long been at its periphery. The unemployment rate for those with less than a high-school education has declined from 10¾% in early 1994 to 6¾% today, twice the percentage point decline in the overall unemployment rate. These gains have enabled large segments of our society to obtain skills on the job and the self-esteem associated with work. The question before us today is what macroeconomic policy settings can best extend this favorable performance. No doubt, a monetary policy focused on promoting price stability over the long run and a fiscal policy focused on enhancing national saving by accumulating budget surpluses have been key elements in creating an environment fostering the capital investment that has driven the gains to productivity and living standards. I am confident that by maintaining this discipline, policymakers in the Congress, in the executive branch, and at the Federal Reserve will give our vital U.S. economy its best chance of continuing its remarkable progress.
|
board of governors of the federal reserve system
| 1999 | 7 |
Remarks by Mr Roger W Ferguson, Jr, a member of the Board of Governors of the US Federal Reserve System and Chairman of the Joint Year 2000 Council, before the George Washington University Y2K Group in Washington, D.C. on 29 July 1999.
|
Mr Ferguson reviews the progress on Year 2000 readiness and focuses on public disclosure and public confidence in the United States Remarks by Mr Roger W Ferguson, Jr, a member of the Board of Governors of the US Federal Reserve System and Chairman of the Joint Year 2000 Council, before the George Washington University Y2K Group in Washington, D.C. on 29 July 1999. * * * As the countdown to the Year 2000 continues, and with only five months to go until the new year, it is certainly prudent for us to focus now on the Year 2000 readiness of our nation and, indeed, the entire global community. So I applaud the Research Program in Social and Organizational Learning here at George Washington University for sponsoring this wide-ranging conference. Over the course of several days, this assembly has discussed important issues in many sectors of our economy. These Year 2000 discussions, and others like them around the nation, will help to create public awareness and understanding, which are very important building blocks in the structure of public confidence. I will come back to the confidence issue in a few minutes, but I will commence with a review of progress to date. Preparedness of the Domestic Banking Industry First I want to provide today some perspectives on the Year 2000 readiness of the US banking sector. At the Federal Reserve we are closely involved with preparations for the century date change in the financial sector here and around the world. We expect there will be glitches. Nothing this complex can be completely without fault. However, I want to confirm at the outset that the Federal Reserve is ready for the Year 2000, and every current indication is that the domestic banking system is, for all practical purposes, also ready for a smooth transition. Much of my emerging confidence results from the fact that the Federal Reserve System is fully prepared for the Year 2000; 100% of the Federal Reserve’s mission-critical systems are ready. At the Federal Reserve Board and the 12 District Banks, we are using today the automated systems that we will use in the Year 2000. In fact, we have been testing these systems for more than a year with banks and thrifts that are linked to us around the country for payments and related functions. These tests have gone extremely well. This is significant because the Federal Reserve Banks operate the hub of the nation’s payments system, providing depository institutions with essential services in cash, checks, and electronic payments. Thus, because of the Federal Reserve’s readiness, Americans can have confidence that the nation’s basic payments infrastructure is sound and ready to process payments as usual before and after the century date change. Another fact that gives me confidence in the domestic financial sector’s preparations for the century date change is that almost all of the nation’s banks, thrift institutions, and credit unions are ready for Year 2000. The Federal Deposit Insurance Corporation (FDIC) reported last month that 98.3% of insured institutions were ready. These good evaluations reflect years of diligent work by thousands of large and small institutions alike to meet rather aggressive deadlines for mission-critical systems set by the regulators. As you may know, banks were required to meet certain milestones for identifying, renovating, implementing and testing their mission-critical automated systems to ensure that their customers would receive the quality and timeliness of service they expect after the century date change. 
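For readers unfamiliar with what “renovating” a mission-critical system involved, one widely used remediation technique was date windowing; the sketch below is a generic illustration of the underlying flaw and that fix, not a description of any particular institution’s systems.

```python
# The underlying Y2K flaw: a two-digit year is ambiguous ("00" vs. "99"),
# so date arithmetic across the rollover breaks. "Windowing" was one common fix.

PIVOT = 50  # assumed window: 00-49 map to the 2000s, 50-99 to the 1900s

def expand_year(two_digit_year):
    """Map a legacy two-digit year to a four-digit year using a fixed window."""
    if two_digit_year < PIVOT:
        return 2000 + two_digit_year
    return 1900 + two_digit_year

# Unremediated logic: an account opened in "99" evaluated in "00" looks -99 years old.
naive_elapsed = 0 - 99                            # -> -99
fixed_elapsed = expand_year(0) - expand_year(99)  # 2000 - 1999 -> 1
print(naive_elapsed, fixed_elapsed)
```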
The small handful of institutions that have not yet met the regulatory milestones are, I assure you, receiving the full attention of the regulators. Combined with the readiness of the Federal Reserve to support the nation’s payments infrastructure, the readiness of commercial banks means that we can expect that the processing systems that support all methods of payment – ATM cards, debit cards, credit cards, direct deposit and other electronic payments, checks, and even cash – will work smoothly around the date change. The readiness of the Federal Reserve and the extensive preparations in the banking system do not, however, guarantee perfection. No one can say that there will be no problems in the banking system when the century date rolls over. As we are all aware, processing and other failures do occur in complex modern automated systems, often in ways that are rarely visible to the public. We fix these annoying minor faults every day and go on, and this is what we will do at the time of the century date change. Moreover, if isolated glitches do emerge, the Federal Reserve and depository institutions will draw on their contingency plans to get systems operating again as promptly as possible. At the Federal Reserve, we have extensive contingency plans, many of them already well tested during hurricanes, blizzards, and other events. It is important to emphasize that we believe such contingency planning to be an important element of planning and preparedness for a wide variety of possible events. It does not mean that we think particular problems are likely. In general, we encourage all banking institutions to undertake appropriate contingency planning, so they will be ready just in case there is an unexpected or unusual set of events. For the special circumstances of the century date change, we have developed additional wide-ranging plans to deal with contingencies of all kinds, even though we expect the banking system to operate normally. Let me give you some examples. One recent example of our contingency planning is the establishment of special liquidity arrangements for borrowing by depository institutions. As lender of last resort, the Federal Reserve always is ready to provide short-term loans to banks and thrifts that experience temporary liquidity needs. For additional liquidity, the Federal Reserve announced a plan to help healthy institutions meet credit demands late this year and in the first quarter of 2000 should they face short-term liquidity problems. This borrowing arrangement, by its very existence, even if it is not widely used, should provide assurance to the public and banks that the lender of last resort will be there to provide liquidity should banks need it during the rollover to 2000. To be prepared in case our fellow citizens choose to have a little extra cash on hand for the century rollover – it will be a long holiday weekend for many, after all – the Federal Reserve is building an inventory of currency in our vaults. This inventory will be more than enough to meet any conceivable demand for cash. Given the high expectation that all normal payments methods will work, we do not expect that there will be any unusual demand for cash, but if there is, we will be ready to meet it. Just as the Federal Reserve has planned for even very remote contingencies, individual banks and thrifts also have been required to extend existing contingency plans to address new risks posed by Year 2000 and test these plans. 
These plans are meant to ensure that customers can be confident they will have access to their funds during the century date rollover period. For example, each institution was required to develop a liquidity plan describing how it would meet demands for cash from its customers and how it would meet the credit needs of its community. For some institutions, getting extra cash from the Federal Reserve or borrowing in the financial markets may be part of their plans. Having detailed the readiness and contingency planning of the Federal Reserve and the entire US banking sector, I hope you can understand why I am increasingly confident that generally normal operations will prevail in this country’s financial institutions during the century date change. International Financial Sector Developments Turning now to the international arena, I am encouraged by the progress being made this year by most foreign financial regulators and the largest financial institutions to meet the Year 2000 challenge, but I am somewhat less certain of the international preparations than I am of our domestic readiness. Let me emphasize that I am somewhat less certain, not fearful. Given the wide variety of institutions and systems around the world, and the obvious difficulties in systematically collecting definitive and current information regarding Year 2000 readiness, no one can know for certain all of the international outcomes of the century date change. However, in my role as Chairman of the Joint Year 2000 Council, which is an international group of financial regulators representing banking, insurance, and securities regulators, as well as central bankers and payment system overseers, I have information that helps bring some of the issues into clearer focus. The preparation of our domestic payments links is bolstered by the fact that foreign financial services firms are generally believed to be among the best prepared in their respective countries. We do know that the largest, most internationally active firms are making good progress toward preparedness. They are forced to do so in order to remain viable in their home markets and globally. I also know that awareness among financial regulators is high. The Joint Year 2000 Council has held meetings in all regions of the globe. These regional meetings have been attended by regulators from over 100 countries. Similarly, the United Nations hosted a meeting for Year 2000 Coordinators in June, which was attended by representatives of more than 170 countries. These public sector efforts are mirrored by regional meetings held by leading members of the private sector. Surveys of international financial regulators, as well as my own conversations with many of them, also have demonstrated that a recognition of the problem has been increasingly translated into concrete plans and actions, many of which have gathered momentum in the last six months. Another positive perspective flows from the large and very successful test of domestic and cross-border payments systems last month. In this test, using a simulated Year 2000 environment, more than 500 financial market participants from 19 countries were able to complete transactions in 34 national and international payment systems. The test included sending and receiving payment instructions, and payment settlements. There were virtually no errors, and the few problems discovered were remedied quickly. That cross-border test is one example of a broader development. 
We are getting increasingly positive information regarding the preparedness of core financial systems overseas, importantly including payment and settlement systems, that could, if not prepared, trigger difficulties in the world’s financial operations. These systems have generally implemented extensive internal testing and most are testing with participants as well. The large test last month does seem to confirm that Year 2000 readiness has improved in many of the more important financial sectors around the world. Avoiding Complacency We should not let these recent successes lead to complacency, either domestically or internationally. Domestically, institutions should continue to make preparations for the rollover period through event management and contingency planning, and in a very few cases by further remediation work. Most importantly, this should include maintaining an active effort to keep customers appropriately informed. Because of the important role of public confidence in sustaining financial stability during an event such as the Year 2000 rollover, in which there are inevitable uncertainties, persistent and effective public communication is essential. Full and fact-based information will certainly help individuals maintain perspective and encourage them to avoid taking needless risks. On the other hand, incomplete information or misunderstanding may cause some to take risks such as shifting deposits from banks, or withdrawing large sums of cash, or even making unreasoned decisions about other assets they hold. Without solid perspective about Year 2000 preparations, others may fall victim to Year 2000-related frauds or be asked to buy Year 2000 products and services of questionable value. Similarly, there is still work to be done internationally. First, while the financial sector worldwide is considered to be ahead of most other sectors, it is generally thought that the United States still leads most other countries in the degree of preparedness, although some are as advanced as we are. For example, testing programs and, in some cases, further remediation efforts still need to be made in some countries, although these activities are obviously quite advanced in our major trading partners and the other major industrialized economies. Second, all countries should hold themselves to the highest standard of self-disclosure, so that financial markets can make fully informed decisions. This need for disclosure includes industrial countries, as well as developing countries. Similarly, financial market participants need to seek full information on international preparedness, and make reasonable, calm and considered – not hasty – trade-offs between risk and reward. Now that it seems as though the developed world is generally well positioned in dealing with the Year 2000 problem, the focus has shifted to the “emerging market countries”. This group is described by some to be at the greatest risk of failure due to technical difficulties. My concern is that “emerging market countries” is too broad a group of nations, numbering well over 100. It is important that emerging and newly industrialized economies making good progress toward preparedness, and there are many, disclose that as clearly as possible. Market participants should look for and make those disclosures part of their decisionmaking. 
Those “emerging market economies” that are not making satisfactory progress should disclose their status and seek assistance as they work to remediate mission-critical systems and engage in contingency planning. In particular, financial institution preparedness can be hampered by a lack of preparedness in critical infrastructure – telecommunications, power, and water. This cannot be taken for granted. I would, however, note that many of the countries that may be the farthest behind are also those that are least dependent upon technology. Similarly, many of these countries have the fewest links to the international financial markets, and their performance during the century date change is least likely to have a significant cross-border impact. Finally, good contingency planning, including manual work-arounds, may avoid serious problems. Ultimately, however, systems will have to be fixed or replaced, whether before the century date change or after, and the sooner the process is started, the better. Conclusion To sum up, I believe that the US banking system is largely ready for the Year 2000 and that major foreign financial institutions generally are working diligently to be ready as well by 1 January. Thus, I believe that the Year 2000 in the financial sector of the United States and many other countries is really less of a technical issue now than it was last year at this time. Moreover, the relevant regulators here and around the world know what needs to be done over the next five months, and are communicating with the business leaders managing the world’s financial institutions. The progress we see in the global financial sector and the evidence from successful cross-border and domestic tests are also positive signs. Thus, with the technical and business elements generally making good progress, the real issue remaining for the United States and other leading countries is that of public awareness, understanding and, ultimately, public confidence. We should all recognize that the Year 2000 event is unique in that we all know that it will occur, but exactly what will occur is uncertain. In this environment, we should listen to the most reasonable and responsible sources. Part of maintaining public confidence is fact-based disclosure both to the public at large and to market counterparties. Disclosure still remains an issue for some, particularly given the need to distinguish, within the broad group of “emerging market economies”, those making good progress from those that need to increase efforts. Many emerging market and newly industrialized countries are making good progress. For some countries, technical challenges may remain, particularly for those that recognized this problem relatively recently. Finally, we know that there are likely to be some glitches, which I would expect to be small and of short duration. Similarly, I would expect any international disturbances to be limited in terms of the number of institutions and countries affected. In order to achieve that outcome, however, some countries and institutions should focus on mission-critical systems, and all should engage in contingency planning as the key activity now. By remaining dedicated for the next five months we can all ensure that the work of the last several years is fully completed, and any remaining risks are minimized. I believe that those who have been working on this effort will rise to the challenge.
Mr Greenspan discusses new challenges for monetary policy Speech given by the Chairman of the Board of Governors of the US Federal Reserve System, Mr Alan Greenspan, before a symposium sponsored by the Federal Reserve Bank of Kansas City in Jackson Hole, Wyoming on 27 August 1999. * * * I should like as a backdrop to this conference on the challenges confronting monetary policy to focus on certain aspects of one of the issues that will be more broadly discussed later this morning: asset pricing and macroeconomic performance. As the values of assets and liabilities have risen relative to income, we have been confronted with the potential for our economies to exhibit larger and perhaps more abrupt responses to changes in factors affecting the balance sheets of households and businesses. As a result, our analytic tools are going to have to increasingly focus on changes in asset values and resulting balance sheet variations if we are to understand these important economic forces. Central bankers, in particular, are going to have to be able to ascertain how changes in the balance sheets of economic actors influence real economic activity and, hence, affect appropriate macroeconomic policies. At root, all asset values rest on perceptions of the future. A motor vehicle assembly plant is a pile of junk if no participants in a market economy perceive it capable of turning out cars and trucks of use to consumers and profit to producers. Likewise, the scrap value at the end of the plant’s service life will be positive only if it is convertible into usable products. The value ascribed to any asset is a discounted value of future expected returns, even if no market participant consciously makes that calculation. In principle, forward discounting lies behind the valuation of all assets, from an apple that is about to be consumed to a hydroelectric plant with a hundred-year life expectancy. On such judgments of value rests much of our economic system. Doubtless, valuations are shaped in part, perhaps in large part, by the economic process itself. But history suggests that they also reflect waves of optimism and pessimism that can be touched off by seemingly small exogenous events. This morning, I plan to address some of the problems that arise in evaluating the prices of equities. I should like to first focus on some significant difficulties of profit accounting that impede judgments about prospective earnings. In particular, there are some difficulties that have become more severe as a consequence of the recent acceleration of technologies, which, in turn, are markedly altering patterns of economic organization and production. And then I will discuss a different set of forces that mold the development of discount factors which, together with earnings projections, produce estimates of market value. First, the rapid shift in the composition of gross domestic product towards idea-based value added is muddying our measures of current earnings and, hence, our projections of future earnings. The key definitional question that must be confronted is, what is a capital outlay? Conversely, what is an expense that, by definition, is consumed in the process of production and deemed an intermediate product? This issue is most immediately evident in accounting for software outlays, but it is rapidly expanding to a much broader range of activities. Software that is embedded in capital equipment, and some that is stand-alone, is currently being capitalized and consequently amortized against current and future earnings. 
But a substantial portion of software spending is expensed, even though the equity prices of the purchasing companies are clearly valuing the software outlays as contributing to earnings over their useful economic lives – the relevant criterion for capitalizing an asset. There has always been a fuzzy dividing line between what is expensed and what is capitalized. This has historically bedeviled the accounting for research and development, for example. But the major technological advances of recent years have exposed a wide swath of rapidly growing outlays that, arguably, should be capitalized so that the returns they produce would be more accurately reflected as earnings over time. Indeed, there is even an argument for capitalizing new ideas, such as different ways of organizing production, that enhance the value of a firm without any associated outlays. Some analysts judge the size of undercapitalized outlays as quite large.[1] The important point, however, is that decisions about which items to expense will have important consequences for reported earnings. In general, if the trend of expensed items that should be capitalized is rising faster than reported earnings, switching to capitalizing these items will almost always accelerate the growth in earnings. The reverse, of course, is also true. But the newer technologies, and the productivity and bull stock market they have fostered, are also accentuating some accounting difficulties that tend to bias up reported earnings. One is the apparent overestimate of earnings that occurs as a result of the distortion in the accounting for stock options. The combination of not charging their fair value against income, and the practice of periodically repricing those options that fall significantly out of the money,[2] serves to understate ongoing labor compensation charges against corporate earnings. This distortion, all else equal, has, according to Fed staff calculations, overstated the growth of reported profits by 1 to 2 percentage points annually during the past five years. Similarly, the rise in stock prices, which reduces corporate contributions to pension funds, is also augmenting reported profits. These upward adjustments in reported earnings, of course, are a consequence of rising stock prices and, hence, may not be of the same dimension in the future. Nonetheless, it is reasonable to surmise that undercapitalized expenses have been rising sufficiently faster than reported earnings to have more than offset the factors that have temporarily augmented reported earnings. It does not seem likely, however, even should all of the appropriate accounting adjustments to earnings be made, that such adjustments can be the central explanation of the extraordinary increase in stock prices over the past five years. However we calculate profits and capital, shifts in the stock market value of firms will doubtless continue to remain important influences on our economies. It is thus incumbent on us to improve our understanding of the process by which projections of future earnings are translated into asset market value. Even our most sophisticated analytic techniques have difficulty dealing with the interactions among time preference, risk aversion and uncertainty, and with the implications of these interactions for the risk premiums that are embedded in asset prices. It is in our failure to anticipate changes in this discounting process that much of our inability to accurately forecast economic events lies. 
For example, the dramatic changes in information technology that have enabled businesses to embrace the techniques of just-in-time inventory management appear to have reduced that part of the business cycle that is attributable to inventory fluctuations and, accordingly, may well have been a factor in the apparent decline in equity premiums that has characterized the latter part of the 1990s. Whether the decline in these premiums themselves may foster activities that could result in wider business cycles, as some maintain, is an open question. As model builders know, all economic channels of influence are not of equal power to engender growth or contraction. Of crucial importance, and still most elusive, is arguably the behavior of asset markets. More broadly, there is an increasing need to integrate into our macro models more complete descriptions of the responses of households and businesses to risk – behaviors that are generally modeled separately under the rubric of portfolio risk management.

[1] For example, Erik Brynjolfsson and Shinkyu Yang, “The Intangible Costs and Benefits of Computer Investments: Evidence from the Financial Markets,” MIT Sloan School, mimeo, April 1999.

[2] The Financial Accounting Standards Board (FASB) will require that the cost of repricing of options be charged against income starting later this year.

The translation of value judgments into market prices is, of course, rooted in how people discount uncertain future outcomes. An individual’s degree of risk aversion may vary through time and possibly be subject to herd instincts. Nonetheless, certain stable magnitudes are inferable from the process of discounting of future claims and values. One of the most enduring is that interest rates, as far back as we can measure, appear trendless, despite vast changes in technology, life expectancy and economic organization. British long-term government interest rates, for example, mostly ranged between 3% and 6% from the early eighteenth century to the early twentieth century, and are around 5% today. Indeed, scattered evidence dating back to ancient Rome and before reflects the same order of interest rate magnitude, not a 1% interest rate nor 200%. This suggests that the rate of time preference underlying interest rates, like so many other aspects of human nature, has not materially changed over the generations. But while time preference may appear to be relatively stable over history, perceptions of risk and uncertainty, which couple with time preference to create discount factors, obviously vary widely, as does liquidity preference, itself a function of uncertainty. The impact of increasing uncertainty and risk aversion was nowhere more evident than in the crisis that gripped financial markets last autumn, following the Russian default. That episode of investor fright has largely dissipated. But left unanswered is the question of why such episodes erupt in the first place. It has become evident time and again that when events are unexpected, more complex, and move more rapidly than is the norm, human beings become less able to cope. The failure to be able to comprehend external events almost invariably induces fear and, hence, disengagement from an activity, whether it be entering a dark room or taking positions in markets. And attempts to disengage from markets that are net long – the most general case – mean bids are hit and prices fall. 
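To make the discounting arithmetic concrete, here is a minimal sketch in which every number is an illustrative assumption: a thirty-year earnings path growing at 6% a year, a 5% riskless rate standing in for time preference, and two alternative equity risk premiums. Nothing in it is an estimate of actual market parameters.

```python
# Present value of an expected earnings stream, discounted at a rate that
# combines time preference (a riskless rate) with an equity risk premium.
def present_value(earnings, riskless_rate, risk_premium):
    r = riskless_rate + risk_premium
    return sum(e / (1 + r) ** t for t, e in enumerate(earnings, start=1))

# Illustrative only: thirty years of earnings growing 6% a year from 100.
earnings = [100 * 1.06 ** t for t in range(30)]

calm = present_value(earnings, riskless_rate=0.05, risk_premium=0.03)
fearful = present_value(earnings, riskless_rate=0.05, risk_premium=0.05)

print(f"value with a 3% premium: {calm:,.0f}")
print(f"value with a 5% premium: {fearful:,.0f}")
```

On these assumed numbers, a two-percentage-point rise in the premium cuts the valuation by more than a fifth with no change whatever in expected earnings, which is the sense in which shifting perceptions of risk, and not earnings news alone, can move market values.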
Modern quantitative approaches to risk measurement and risk management take as their starting point historical experience with market price fluctuations, which is statistically summarized in probability distributions. We live in what is, for the most part, a stable economic system, where market imbalances that produce unusual outcomes almost always give rise to continuous and inevitable moves back towards longer-run equilibrium. However, the violence of the responses to what seemed to be relatively mild imbalances in Southeast Asia in 1997 and throughout the global economy in August and September 1998 has illustrated yet again that the adjustments in asset markets can be discontinuous, especially when investors hold highly leveraged positions and when views about long-term equilibria are not firmly held. Enough investors usually adopt strategies that take account of longer-run tendencies to foster the propensity for convergence toward equilibrium. But from time to time, this process has broken down as investors suffer an abrupt collapse of comprehension of, and confidence in, future economic events. It is almost as though, like a dam under mounting water pressure, confidence appears normal until the moment it is breached. Risk aversion in such an instance rises dramatically, and deliberate trading strategies are replaced by rising fear-induced disengagement. Yield spreads on relatively risky assets widen dramatically. In the more extreme manifestation, the inability to differentiate among degrees of risk drives trading strategies to ever-more-liquid instruments so investors can immediately reverse decisions at minimum cost should that be required. As a consequence, even among riskless assets, such as U.S. Treasury securities, liquidity premiums rise sharply as investors seek the heavily traded “on-the-run” issues – a behavior that was so evident last fall. History tells us that sharp reversals in confidence happen abruptly, most often with little advance notice. These reversals can be self-reinforcing processes that can compress sizable adjustments into a very short time period. Panic market reactions are characterized by dramatic shifts in behavior to minimize short-term losses. Claims on far-distant future values are discounted to insignificance. What is so intriguing is that this type of behavior has characterized human interaction with little appreciable difference over the generations. Whether Dutch tulip bulbs or Russian equities, the market price patterns remain much the same. We can readily describe this process, but, to date, economists have been unable to anticipate sharp reversals in confidence. Collapsing confidence is generally described as a bursting bubble, an event incontrovertibly evident only in retrospect. To anticipate a bubble about to burst requires the forecast of a plunge in the prices of assets previously set by the judgments of millions of investors, many of whom are highly knowledgeable about the prospects for the specific companies that make up our broad stock price indexes. If episodic recurrences of ruptured confidence are integral to the way our economy and our financial markets work now and in the future, it has significant implications for risk management and, by implication, macroeconomic modeling and monetary policy. 
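A minimal numerical sketch of one such implication, with invented parameters throughout: suppose daily returns are drawn from a calm regime on almost all days, with a small probability of a panic regime.

```python
# Two-regime mixture of daily returns: calm on most days, with a small
# probability of a panic regime. All parameters are invented for
# illustration; nothing here is estimated from actual market data.
from statistics import NormalDist

calm = NormalDist(mu=0.0005, sigma=0.01)   # ordinary trading days
panic = NormalDist(mu=-0.03, sigma=0.04)   # rare panic days
p_panic = 0.01                             # assume 1% of days

loss = -0.05  # a one-day drop of 5%

# Distribution fitted over calm periods only:
p_calm_only = calm.cdf(loss)

# Full mixture, which carries a secondary bulge in the left tail:
p_mixture = (1 - p_panic) * calm.cdf(loss) + p_panic * panic.cdf(loss)

print(f"calm-only estimate of P(5% drop): {p_calm_only:.1e}")  # ~2e-07
print(f"mixture estimate of P(5% drop):   {p_mixture:.1e}")    # ~3e-03
```

On these made-up numbers, the calm-only fit treats a 5% one-day drop as a once-in-many-millennia event, while the mixture assigns it a few chances in a thousand, a gap of roughly four orders of magnitude.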
Probability distributions that are estimated largely, or exclusively, over cycles excluding periods of panic will underestimate the probability of extreme price movements because they fail to capture a secondary peak at the extreme negative tail that reflects the probability of occurrence of a panic. Furthermore, joint distributions estimated over periods without panics will misestimate the degree of correlation between asset returns during panics. Under these circumstances, fear and disengagement by investors often result in simultaneous declines in the values of private obligations – as investors no longer realistically differentiate among degrees of risk and liquidity – and increases in the values of riskless government securities. Consequently, the benefits of portfolio diversification will tend to be overestimated when the rare panic periods are not taken into account. As we make progress, hopefully, towards understanding asset-pricing mechanisms, we need also to upgrade our insights into the effect of changing asset values on GDP – the so-called wealth effect. Although many aspects of this issue deserve attention, let me cite a few open questions of particular importance. Efforts to differentiate between realized and unrealized gains, and the propensity to leverage both, may afford a deeper understanding of the consequences of asset price change. And differentiating between gains that arise from enhanced profitability and those that reflect changes in discount factors may also be useful. The former may be more likely to be sustained, given the tendencies of discount factors to revert to historic norms. Moreover, it is evident that borrowings against capital gains on homes influence consumer outlays beyond the effects of gains from financial assets. Preliminary work at the Federal Reserve suggests that the extraction of equity from housing has played an important role in recent years. However, stock market values and capital gains on homes are correlated and, hence, their separate effects are difficult to identify. This is an area that clearly warrants further examination. Finally, in the business sector, questions remain about the influence of equity prices on investment spending. In particular, do all equity price movements – whether related to fundamentals or not – have the same effect on investment spending? In conclusion, the issues that I have touched on this morning are of increasing importance for monetary policy. We no longer have the luxury of looking primarily to the flow of goods and services, as conventionally estimated, when evaluating the macroeconomic environment in which monetary policy must function. There are important – but extremely difficult – questions surrounding the behavior of asset prices and the implications of this behavior for the decisions of households and businesses. Accordingly, we have little choice but to confront the challenges posed by these questions if we are to understand better the effect of changes in balance sheets on the economy and, hence, indirectly, on monetary policy.
Mr Greenspan discusses developments crucial to understanding the impressive economic record of the United States Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, at the Millennium Lecture Series, sponsored by the Gerald R. Ford Foundation and Grand Valley State University at Grand Rapids, Michigan on 8 September 1999 * * * Thank you for your kind welcome to Grand Valley State University and the Ford Museum Millennium Lecture Series. Over the past quarter-century I have appeared on many platforms with President Ford. He never seems to change, but I keep losing my hair. Those of us who had the privilege to work closely in the White House with the 38th President of the United States gained a respect for his wisdom and integrity, not all of which came from his obvious good judgment in marrying Betty. Hence it has been a special pleasure for me, and my Ford Administration colleagues, to see our view of Gerald Ford increasingly being shared by the American people and the rest of the world. When I was invited to participate in this series of lectures complementing the remarkable exhibit entitled “The American Century,” I was told to talk about anything I’d like. You will be pleased to know that I resisted the temptation to prepare a discourse on the statistical discrepancy between gross domestic product and gross domestic income. Instead, given that we are at the tipping point of a new century, I decided to speak about several developments that strike me as crucial to understanding this country’s rather impressive economic record. It is safe to say that we are witnessing this decade, in the United States, history’s most compelling demonstration of the productive capacity of free peoples operating in free markets. I said earlier this year that members of the graduating class of 1999 are being bequeathed the tools for achieving a material existence that neither my generation nor any that preceded it could have even remotely imagined as we began our life’s work. The quintessential manifestations of America’s industrial might earlier this century – large steel mills, auto assembly plants, petrochemical complexes, and skyscrapers – have been replaced by a gross domestic product that has been downsized as ideas have replaced physical bulk and effort as creators of value. Today, economic value is best symbolized by exceedingly complex, miniaturized integrated circuits and the ideas – the software – that utilize them. Most of what we currently perceive as value and wealth is intellectual and impalpable. The American economy, clearly more than most, is in the grip of what the eminent Harvard professor Joseph Schumpeter many years ago called “creative destruction,” the continuous process by which emerging technologies push out the old. Standards of living rise when incomes created by the productive facilities employing older, increasingly obsolescent, technologies are marshaled to finance the newly produced capital assets that embody cutting-edge technologies. This is the process by which wealth is created, incremental step by incremental step. It presupposes a continuous churning of an economy as the new displaces the old. Although this process of productive obsolescence has ancient roots, it appears to have taken on a quickened pace in recent years and changed its character. 
The remarkable, and partly fortuitous, coming together of the technologies that make up what we label IT – information technologies – has begun to alter, fundamentally, the manner in which we do business and create economic value, often in ways that were not readily foreseeable even a decade ago. Before the advent of what has become a veritable avalanche of information technology innovation, most twentieth-century business decisionmaking had been hampered by dated and incomplete information about customer preferences in markets and flows of materials through a company’s production systems. Relevant information was hours, days, or even weeks old. Accordingly, business managers had to double up on materials and people to protect against the inevitable misjudgments that were part and parcel of production planning. Ample inventory levels were needed to ensure output schedules, and backup teams of people and machines were required to maintain quality control and respond to unanticipated developments. Of course, large remnants of imprecision still persist, but the remarkable surge in the availability of real-time information in recent years has sharply reduced the degree of uncertainty confronting business management. This has enabled businesses to remove large swaths of now unnecessary inventory, and dispense with many of the programmed worker and capital redundancies. As a consequence, growth in output per work hour has accelerated, elevating the standard of living of the average American worker. Intermediate production and distribution processes, so essential when information and quality control were poor, are being bypassed and eventually eliminated. The proliferation of Internet web sites is promising to alter significantly the way large parts of our distribution system are managed. Moreover, technological innovations have spread far beyond the factory floor and retail and wholesale distribution channels. Biotech, for example, is revolutionizing medicine and agriculture, with far-reaching consequences for the quality of life not only in the United States but around the world. The explosion in the variety of products of many different designs and qualities has opened up the potential for the satisfaction of consumer needs not evident even a decade or two ago. The accompanying expansion of incomes and wealth has been truly impressive, though regrettably the gains have not been as widely spread across households as I would like. How is this remarkable economic machine to be maintained, and how can we better ensure that its benefits reach the greatest number of people? Certainly, we must foster an environment in which continued advances in technology are encouraged and welcomed. If, as I indicated in a commencement address this spring, the graduates of 1999 are going to be able to build on the accomplishments of their forebears, many of them must push forward to expand our knowledge in science and engineering, and our universities must ready themselves to meet the technical needs of our students yet to come. But scientific proficiency will not be enough. Skill alone may not be sufficient to move the frontier of technology far enough to meet the many challenges that our nation will confront in the decades ahead. And technological advances alone will not buttress the democratic institutions, supported by a rule of law, which are so essential to our dynamic and vigorous American economy. Each is merely a tool, which, without the enrichment of human wisdom, is of modest value. 
A crucial challenge of education is to transform skills and intelligence into wisdom – into a process of thinking capable of forming truly new insights. But learning and knowledge – and even wisdom – are not enough. National well-being, including material prosperity, rests to a substantial extent on the personal qualities of the people who inhabit a nation. Civilization, our civilization, rests on the presumption of a productive interaction of people engaged in the division of labor, driven by a process economists label comparative advantage. This implies mutual exchange to mutual advantage among free people. To repeat what I said five years ago here in Grand Rapids before the Gerald R. Ford Foundation: institutions are needed that give free play to the inventive capacities of people and effectively promote the translation of conceptual innovations into increased output of goods and services that are the lifeblood of material progress. What these particular institutions should be has not always been as clear as it is today. Much of this past century, in effect, has been a test of whether capitalist institutions or more centrally planned socialist institutions would work better, over the long run, in serving the needs of human society. Specifically, on 9 November 1989 the Berlin Wall came down, symbolizing the end of an experiment in social policy that began more than four decades earlier with the division of the states of Western and Central Europe into market economies and those governed by state central planning. At the end of World War II, as Winston Churchill put it, “From Stettin in the Baltic to Trieste in the Adriatic an iron curtain … descended across the Continent.” The economies on the Soviet side of the “curtain” had been, in the prewar period, similar to the market-based economies on the western side. Over four decades both types of economies developed with limited interaction across the dividing line. It was as close to a controlled experiment in economic systems as could ever be implemented. With the books now closed on this experiment, we of course have learned much about how communist economics works, or, more exactly, does not. Just how inefficient the economies of Eastern Europe and the former Soviet Union were prior to 1989 is best illustrated by the fact that energy consumed per unit of output was as much as five to seven times higher than in the West. Moreover, the exceptionally large amount of resources devoted to capital investment, without contributing to the productive capacity of these economies, suggested that these resources were largely wasted. In addition, such gaps in efficiency actually understated the gap in performance because they failed to take into account the impact of industrial activity on the environment. The market economies of the West have expended resources to minimize the adverse impact of industrial activity on the environment. No such resource allocation was made in the Soviet bloc, and the cumulative effect of this neglect is appalling. At least for the foreseeable future, the experiment seems to have been concluded overwhelmingly in favor of the free-market capitalist institutions. The bottom line is that coercive societies rarely enhance the state of what we call civilization. But neither do coercive relationships among people. It is decidedly not true that “nice guys finish last,” as that highly original American baseball philosopher, Leo Durocher, was once alleged to have said. 
I do not deny that many in our society appear to have succeeded in a material way by cutting corners and manipulating associates, both in their professional and in their personal lives. But material success is possible in this world without exploiting others, and clearly, having a reputation for fair dealing is a profoundly practical virtue. We call it “good will” in business and add it to our balance sheets. Trust is at the root of any economic system based on mutually beneficial exchange. In virtually all transactions, we rely on the word of those with whom we do business. Were this not the case, exchange of goods and services could not take place on any reasonable scale. Our commercial codes and contract law presume that only a tiny fraction of contracts, at most, need be adjudicated. If a significant number of business people violated the trust upon which our interactions are based, our court system and our economy would be swamped into immobility. It is not by chance that in nineteenth-century America, many bankers could effectively issue uncollateralized currency because they were able to develop a reputation that their word was their bond. For these institutions to succeed and prosper, people had to trust their promise of redemption in specie. Now, as then, a contractor with a reputation for shoddy work will not prosper long. In today’s world, where ideas are increasingly displacing the physical in the production of economic value, competition for reputation becomes a significant driving force, propelling our economy forward. Manufactured goods often can be evaluated before the completion of a transaction. Service providers, on the other hand, usually can offer only their reputations. The extraordinarily complex machine that we call the economy of the United States is, in the end, made up of human beings struggling to improve their lives. The individual values of those Americans and their reputations will continue to influence the structure of the institutions that support market transactions, as they have throughout our history. Without mutual trust, and market participants abiding by a rule of law, no economy can prosper. Our system works fundamentally on individual fair dealing. We need only look around today’s world to realize how rare and valuable this is. While we have achieved much in this regard, more remains to be done. Considerable progress, for example, has been evident in recent decades in the reduction of racial and other forms of discrimination. But this job is still far from completion. A free-market capitalist system cannot operate fully effectively unless all participants in the economy are given opportunities to achieve their best. If we succeed in opening up opportunities to everyone, our national affluence will almost surely become more widespread. Of even greater import is that all Americans believe that they are part of a system they perceive as fair and worthy of support. Our forefathers bestowed upon us a system of government, and a culture of enterprise, that has propelled the United States to the greatest prosperity the world has ever experienced. The contributions of our national leaders, people like President Ford, have sustained and promoted that culture in the most difficult of circumstances and have given us the tools to improve upon this inheritance in ways that we have yet to imagine.
Mr Meyer focuses on the implications for the conduct of current monetary policy in the United States Remarks by Mr Laurence H Meyer, Member of the Board of Governors of the US Federal Reserve System, before the Philadelphia Council for Business Economics, Federal Reserve Bank of Philadelphia, Pennsylvania on 8 September 1999. * * * Often the most interesting part of a presentation on a challenging subject – such as the economic outlook and the implications for monetary policy – is the Q&A after the formal remarks. This is especially the case when the audience is well informed and has strong views about the subject – as is always the case with a NABE group. This suggested an innovative approach: dispensing altogether with the formal remarks and going directly to the Q&A. My initial reaction to this insight was: wow – no paper to write, what a relief! My second response was, on the other hand – and isn’t there always an “other hand” when economists start talking – I always remind my audiences at the end of my talks that I am now prepared to “entertain” their questions. They soon learn that can be rather different from answering them. This distinction is important because there are some questions to which I am not prepared to respond. Today I will try to resolve this tension by proceeding directly to the Q&A, but, at the same time, orchestrating the Q&A by first identifying the questions that I am prepared to answer and then answering them. When I have completed this exercise, you are welcome to offer your own answers to my questions or, of course, raise additional questions for me to entertain. I will ask and answer two sets of questions – the first related to the economic outlook and the second to the strategy for monetary policy. In each case, I will focus on the implications for the conduct of current monetary policy. Let me also emphasize that both the questions and the answers reflect my own views and should not be interpreted as the position of the FOMC or the Board of Governors. I. Outlook Q&A 1. Has there been an upturn in productivity growth? What we can confidently say is that, since late 1995, actual productivity growth has increased and, indeed, moved still higher at the end of 1998 and through early 1999. In addition, this improvement cannot be fully accounted for as a normal cyclical improvement because productivity growth increased further after GDP growth stabilized at about 4%. This supports the view that there has been an increase in trend productivity growth. In addition, the recent improved productivity performance is even more remarkable, given that the decline in the unemployment rate to almost a 30-year low might have been expected to result in employment of lower-skilled and hence less-productive workers. Before discussing possible sources and implications of this development, let me clarify what I mean by trend productivity growth. First, I use it to refer to the maximum sustainable rate of growth in productivity, once the economy has reached capacity. Looking backward, it can be measured by de-cycling actual productivity growth or by computing actual productivity growth over periods long enough to wash out the effect of the business cycle. Extrapolating the trend rate forward, it is the rate of actual productivity growth that would be expected if the economy remained at full employment. Trend productivity growth has varied widely over the post-war period, from more than 3% in the 1960s to close to 1% in the decade preceding this expansion. 
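The “wash out the cycle” idea can be sketched in a few lines. The series below is synthetic, built on an assumed trend that steps up from 1% to 2% halfway through, plus a stylized five-year cycle and quarterly noise, so it illustrates the measurement problem rather than any actual data.

```python
# De-cycling by averaging: quarterly productivity growth is noisy and
# pro-cyclical, so one crude way to see the trend is to average over a
# window long enough to wash the cycle out. Synthetic data only.
import math
import random

random.seed(0)
quarters = 40  # ten years of quarterly observations

series = []
for t in range(quarters):
    trend = 1.0 if t < quarters // 2 else 2.0          # assumed break
    cycle = 1.5 * math.sin(2 * math.pi * t / 20)       # 5-year cycle
    noise = random.gauss(0, 1.0)                       # quarterly noise
    series.append(trend + cycle + noise)

def window_mean(xs, start, length):
    return sum(xs[start:start + length]) / length

print("first five-year average :", round(window_mean(series, 0, 20), 2))
print("second five-year average:", round(window_mean(series, 20, 20), 2))
```

Since each window spans a full cycle, the two averages land near the assumed 1% and 2% trends even though individual quarters are dominated by cycle and noise; the break, however, reveals itself only as years of data accumulate.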
Because actual productivity is highly pro-cyclical and because there is considerable noise in quarterly observations, it is not easy to discern immediately changes in the underlying trend. This analytical problem has been further complicated in the current episode by unexpectedly weak productivity growth earlier in the expansion. Recent data are consistent with an increase in the productivity trend, but given that the evidence for an upturn in the productivity trend is so recent, considerable uncertainty remains about what the underlying trend will be going forward. Many forecasters, myself included, have been revising upward their estimate of trend productivity growth in response to recent unexpectedly strong productivity growth. Many of these estimates are now about 2% or slightly higher. Part of the increase – about ¼ percentage point – from the earlier trend of just above 1% reflects improved measurement and is not in fact an acceleration of true productivity. Beyond that, an estimate of a 2% trend implies the trend rate of productivity growth has increased more than ½ percentage point. I do not, however, want to put too much emphasis on that point estimate, given the uncertainty that I believe surrounds the calculation. But an incremental change of even ½ percentage point is really enormous if it is sustained over several decades. The source of such an improvement is some combination of capital deepening and a faster rate of technical advance. It appears that the ratio of capital services to labor has been rising appreciably in the past few years as a result of the prevailing high level of net investment. This likely accounts for some portion of the recent improvement in labor productivity. In addition, some portion of this improvement may also reflect a higher rate of technical advance, related to the rapid pace of innovation in information and communication technology and other high-tech contributions to production. The implications of such an acceleration in productivity are profound, at least if the increase is sustained going forward for a considerable period. Productivity is, of course, a close relative of real income per capita, a widely used measure of living standards. Higher productivity growth therefore means a faster rate of improvement in living standards. It also means increased tax revenue to the government and enhanced ability of the country to meet its longer-run spending obligations, including Social Security. Of course, at the Federal Reserve, the key question is, what does higher productivity growth mean for monetary policy? First, monetary policy never should target a specific rate of growth in output but rather should adjust to changes in resource utilization rates and inflation. This is because growth itself does not cause inflation; it is excessive utilization rates that are a proximate source of inflation pressures. We sometimes try to capture this by saying that monetary policy should foster the maximum sustainable growth that the economy is capable of achieving. More precisely, once at full employment, monetary policy should accommodate the maximum growth that does not push the economy beyond that point. A monetary policy focused on maintaining price stability has to be careful to avoid stifling unexpected increases in trend growth, specifically by confusing higher trend growth with above-trend growth. In addition, monetary policy cannot accept responsibility for raising trend growth. 
The only contribution monetary policy can make in this regard is through promoting price stability and thereby reducing the allocative distortions and possible biases against saving and investment associated with inflation. Another reason for avoiding growth as a target for monetary policy is that the preferred growth rate, at any time, depends on prevailing utilization rates. If the economy is at full employment and at the preferred rate of inflation, trend growth will indeed be the preferred outcome. However, if the economy is initially at utilization rates high enough to result in rising inflation over time, the preferred growth outcome will be below trend, allowing some unwinding of the initial excess demand. Therefore, it is always more precise to characterize the monetary policy response in terms of adjustments to changes in prevailing and projected utilization rates rather than in terms of a response to prevailing and projected growth. Unfortunately, it is sometimes difficult to gauge the degree of excess demand associated with a given utilization rate, a subject I will return to later. What are the implications for monetary policy of a step-up in trend productivity growth and, hence, trend real GDP growth? First, such a development would ultimately call for an upward revision to the targets for money growth, so that the money growth targets would remain consistent with an unchanged target for the inflation rate. As a practical matter, however, this is not a serious policy issue because the monetary aggregate targets play only a minor role in the conduct of monetary policy today. At any rate, such an adjustment might be premature today, given the uncertainty about the underlying trend. However, if the apparent higher trend productivity and GDP growth persists, at some point the money growth ranges should be appropriately adjusted. Under the prevailing operational regime of setting a target for the federal funds rate, money growth would automatically adjust to accommodate the higher rate of trend growth, at an unchanged nominal federal funds rate target. Over the longer run, the challenge under an interest-rate regime is to align the real federal funds rate with its new equilibrium value, which is likely to increase with a higher trend rate of productivity growth (due to a higher return on capital that underlies the new equilibrium real interest rate in the economy). The principal challenge to a monetary policy focused on utilization rates is that an unexpected shift in productivity growth, in effect, lowers the unemployment rate consistent with stable inflation (the NAIRU) for a while. This allows the economy to operate at a higher utilization rate without inflationary consequences, at least until the higher productivity is fully anticipated in wage bargaining or until productivity growth stops accelerating. Let me explain the source of the decline in the effective NAIRU. Assume that the increase in productivity is not anticipated and therefore does not immediately raise workers’ real wage demands. With unchanged nominal wage demands and higher productivity, firms will experience a decline in unit labor costs. This will initially boost profits. But competition will quickly force the lower costs to be passed through to consumers in lower prices, lowering price inflation relative to nominal wage change. This decline in inflation, in turn, will put some downward pressure on nominal wage gains. 
The net result is that an unanticipated increase in the rate of growth of productivity is another example of a favorable supply shock, temporarily lowering inflation. It is useful, nevertheless, to distinguish price shocks – such as declines in energy or non-oil import prices – from productivity growth shocks that have their initial effects on costs rather than on prices. During a transitional period following an unexpected increase in the productivity trend, until productivity growth stabilizes and the higher rate becomes anticipated, it will be possible to operate at resource utilization rates beyond what is sustainable over the longer run without inflationary consequences. It is perfectly reasonable to take advantage of this opportunity, as long as care is taken to return to more sustainable utilization rates as the disinflationary force of the upward shift in productivity growth dissipates. Of course, policymakers must also weigh the option of “opportunistic disinflation” in such a circumstance – the possibility of reducing inflation toward their long-run target without depressing, even temporarily, resource utilization rates. However, if inflation is already at its target, the option of permitting temporarily higher output and employment clearly dominates. The apparent increase in the productivity trend probably has been an important disinflationary force over the last few years. Some of the benefits in this case have been taken in the form of lower inflation and some in the form of temporarily higher resource utilization rates. An important issue therefore is whether current utilization rates are sustainable, once productivity growth stabilizes. This issue motivated my next question and answer. 2. Is the economy overheated, or is there a threat of overheating? An overheated economy is one operating beyond the point of sustainable capacity and therefore experiencing excess demand in labor and product markets. The importance of the concept follows from the reasonable expectation that excess demand in labor and product markets is a proximate source of higher inflation. So, is the economy overheated? There are two ways to identify such a condition. The first is to find some proxy for excess demand in labor and product markets. We have to satisfy ourselves with proxies because excess demand is not directly measurable. The second is to look for the consequences of overheating, in the form of acceleration in wage gains or prices. Proxies for excess demand are utilization rates in the labor and product markets – such as the unemployment rate in the labor market and capacity utilization in the product market. Capacity utilization is measured only in the “industrial” sector, so it is a narrow measure of generalized excess demand in the product market. Nevertheless, it remains an important proxy for the balance of supply and demand in the product market. At any given time, a lower unemployment rate or higher capacity utilization rate implies greater demand relative to supply in the respective market. But we are also searching for an absolute concept, the point of balance between supply and demand in the respective markets that divides excess demand from excess supply – in effect, the origin in a diagram relating inflation to excess demand and supply. That’s the difficult part, because we observe only absolute unemployment and capacity utilization rates and have to estimate their respective “natural” rates, the levels consistent with balance between supply and demand in the respective markets. 
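A minimal sketch of that estimation idea, with invented data: write a wage Phillips curve in which wage growth net of expected inflation responds to the unemployment rate, fit it by least squares, and back out the unemployment rate at which wage pressure is zero.

```python
# Backing an implied natural rate out of a simple wage Phillips curve:
#   wage_growth - expected_inflation = a + b * unemployment + error,
# with b < 0. Wage pressure vanishes where a + b*u = 0, i.e. u* = -a / b.
# The seven data points below are invented for illustration only.

def ols(x, y):
    """Ordinary least squares for a one-regressor line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b  # intercept, slope

unemployment = [4.2, 4.5, 5.0, 5.5, 6.0, 6.5, 7.0]        # percent
wage_pressure = [1.1, 0.9, 0.6, 0.35, 0.1, -0.15, -0.4]   # w minus expected pi

a, b = ols(unemployment, wage_pressure)
print(f"intercept {a:.2f}, slope {b:.2f}, implied u* = {-a / b:.1f}%")
```

On these invented numbers the implied natural rate comes out near 6.2%; with real data the estimate moves with the sample, the proxy for expected inflation, and the specification, which is precisely the uncertainty at issue here.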
Worse still, the balance point we want to identify is not fixed, but shifts over time. Obviously, the more stable the natural rates are, the more useful is the concept in forecasting and policy analysis. The less stable they are, the more uncertainty we will have at any point about the underlying degree of excess demand. We have no choice but to estimate the natural rates from equations that try to capture inflation dynamics. That’s all the Phillips Curve is – an equation that relates nominal wage change to expected inflation and excess demand in the labor market, proxied by the unemployment rate relative to an implicit estimate of the NAIRU, derived directly from the estimation of the equation. We have encountered several difficulties in applying this framework in the recent period. First, based on equations for wage dynamics, the evidence suggests some decline in the NAIRU and more uncertainty about its current value. Second, the signals about excess demand coming from labor and product markets – that is, from unemployment and capacity utilization rates – have diverged to an unusual degree, making an assessment of the overall degree of excess demand in the economy still more difficult. Third, the economy has been subject to powerful price shocks in this episode – including significant swings in oil prices and exchange rates. Such shocks are a second proximate source of movements in inflation, and their presence further complicates the identification of the signal from excess demand as well as the assessment of what utilization rates will trigger rising inflation in the near term. Fourth, it appears that there has been an upturn in the productivity trend, which, as noted above, acts as a disinflationary force for a period of time, further masking underlying excess demand and further complicating the assessment of sustainable utilization rates. So, where does all this put us in terms of proxies for excess demand? The answer is that it leaves us with considerable uncertainty. We’re in an environment where reasonable people can disagree about whether or not there is currently excess demand in the economy and, given that uncertainty, whether or not we ought to use evidence based on estimates of excess demand directly in the conduct of monetary policy. It also puts a priority on developing a strategy that takes account of possible shifts in and uncertainty about both productivity growth and the NAIRU. Although it appears that there is excess demand in the labor market, its effect has been diminished by the combination of the absence of corresponding excess demand in the product markets, the residue of the long period of reinforcing favorable price shocks, and the force of the unexpected acceleration in trend productivity. As the favorable price shocks dissipate or reverse and once trend productivity growth stabilizes, there is a risk that excess demand in the labor market will put the economy on a path of rising inflation, unless growth slows enough to unwind the excess demand before inflation begins to move upward. Given the momentum in sales and expectations for a stronger pace of inventory building in the second half, the consensus is that growth will rebound in the second half to trend or above, though we have not yet seen the effects on spending of the rise in bond rates and the flattening of equity prices since the spring. This should help to slow the growth of domestic demand. 
Although there is some risk that growth could remain above trend and therefore aggravate any initial excess demand, a major concern remains that the prevailing balance of supply and demand in the labor market might put upward pressure on inflation, even if growth slows to trend ahead.

Let me briefly comment on the second indicator of excess demand. Instead of trying to measure the state of balance or imbalance between supply and demand, we could focus on observing the consequence of excess demand – specifically, increases in prices – or, for a given initial inflation rate, increases in inflation. Unfortunately, because of supply shocks, we cannot always make this identification so easily.

I read the recent inflation data as at least suggesting that the underlying inflation rate is stabilizing, after a period of decline, without any evidence of a broad-based upturn in inflation. Nominal wage increases have moderated since the middle of 1998, likely reflecting the decline in inflation associated with a combination of favorable supply shocks, including the unexpected increase in the productivity trend. Some of the most recent data suggest that the growth of nominal compensation is no longer declining, and there are hints in the data and in anecdotes that wage pressures may be building.

How should monetary policy respond to increasing utilization rates? Should real interest rates be held constant until utilization rates increase above some threshold, for example, or should real interest rates be more smoothly pro-cyclical, gradually increasing in response to rising utilization rates? This is simply a question of what systematic policy response works best to advance the dual objectives of monetary policy: promoting price stability and damping fluctuations around full employment. My judgment is that a regime in which there is a gradual, systematic pro-cyclical response of real interest rates is the one that produces the best trade-off between inflation and output variability. This is the kind of response embodied in the Taylor Rule, for example (a stylized sketch appears at the end of these remarks), though in practice, as we have seen, implementation of this approach is complicated by uncertainties about the level of the NAIRU or its cousin, the output gap. I will return to this problem when I take up the question of how preemptive monetary policy can and should be.

3. Are equities overvalued, so that the economy is threatened with an asset market bubble?

This is perhaps the question I am asked most often, so I thought I would preempt you and answer it directly – though you may decide that I chose to entertain this question rather than answer it! Equity prices have increased enormously over the past four years, to levels that challenge previous valuation standards. Let me make clear at the outset that I honestly do not know whether equities are fairly valued or overvalued, and I have nothing to share with you about that question. What I do want to share with you is how the equity market fits into my thinking about monetary policy.

Those of us fortunate enough to attend this year's Jackson Hole Conference, sponsored by the Federal Reserve Bank of Kansas City, had plenty of opportunity to think about and discuss this issue. Most important, policymakers should reflect the higher value of equities in their forecasts for aggregate spending and adjust monetary policy as necessary to remain consistent with the broad objectives of monetary policy.
The key here is to remain focused on broad macroeconomic performance, responding indirectly to the movements in equity prices – whether the higher value of equities appears driven by fundamentals or otherwise – rather than to use policy directly to influence the value of equities. If policy is disciplined in pursuit of its broad macroeconomic objectives, this will reduce (though not eliminate) the prospect that equities will become significantly overvalued.

There are, nevertheless, several steps that policymakers might consider if they have suspicions that equities might be overvalued. They could build some assumption about a market correction into their forecast. That would seem reasonable but could be a mistake. Specifically, it could discourage them from tightening in response to robust demand driven in part by past increases in market values, counting instead on an autonomous correction in equity values, the degree and timing of which has to be extremely uncertain.

On the other hand, given such suspicions, policymakers should be alert to the potential that a tightening of policy could have a disproportionate effect on demand, if it induces a reassessment of market fundamentals. This does not mean, however, that policymakers are trapped and cannot respond to robust demand and rising inflation risks. It does suggest that they should appreciate that there will be more uncertainty about the magnitude of the effect of a given policy tightening and that the effect could be disproportionately large.

While the stock market should not be a target for monetary policy, policymakers should pay attention to the signals from the market. An aggressive rise in equity prices can be a sign of highly favorable financial conditions in general and a highly accommodative monetary policy in particular. If this occurs when the economy is already near potential, policymakers should re-evaluate the appropriateness of their policy setting in terms of promoting price stability and damping fluctuations around full employment.

Finally, policymakers should be alert to the need to respond appropriately to a significant market correction. It is important to note that policymakers should not target the level of equity prices on the way down any more than on the way up but, in both directions, should take the movement in equity prices into account in their forecast and hence in the setting of monetary policy.

II. Monetary policy Q&A

1. Does prevailing uncertainty about the structure of the economy and the recent forecast errors diminish the ability of monetary policy to be preemptive?

Without doubt. But that does not mean that there cannot be a preemptive element in monetary policy. It means only that policy is likely to be less preemptive – and hence more reactive – than it otherwise would be. The critical questions are just how preemptive monetary policy can and should be today.

Let me begin to answer this question by defining what I mean by reactive and preemptive policy approaches. First, policy is reactive if it responds only to the incoming data and preemptive if it also responds to a forecast; this is the distinction between backward-looking and forward-looking policy. Second, with respect to inflation, policy can still be preemptive if it responds to incoming data on utilization rates, given the link between current utilization rates and future inflation.
Of course, the degree of confidence that policymakers have in this link (and specifically in their measure of excess demand) will determine how preemptive they are prepared to be. The greater uncertainty about the level of excess demand should, I believe, diminish the aggressiveness with which monetary policy responds to changes in utilization rates. The difficulty in forecasting should, in addition, encourage more emphasis on responding to incoming data and diminish (though not eliminate) the role of the forecast in the policy decision. This is sensible and prudent.

To sever the relationship of monetary policy either to incoming data on utilization rates or to the forecast altogether, however, would remove the key elements of preemptive monetary policy with respect to containing inflation and leave policy entirely reactive. If the uncertainty were great enough, this would be a reasonable response. But I do not believe that such an extreme position is warranted.

There are, however, a couple of constructive policy responses in light of prevailing uncertainties about the level of excess demand and the forecast. First, policymakers could update their estimates of the NAIRU and the output gap (assuming, in the first place, that they find these concepts useful, as I do) in light of realizations of unemployment, output, inflation and other variables. This has, in fact, been one response that many, including myself, have taken in response to recent developments. In following this approach, policymakers would become less responsive to declines in the unemployment rate, to the extent that estimates of the NAIRU are revised downward as unemployment and inflation decline together.

Second, policymakers could attenuate the response of the real federal funds rate to declines in the unemployment rate in a region around their estimate of the NAIRU (a stylized sketch of such an attenuated response appears at the end of these remarks). But once the unemployment rate gets far enough below (or above) the estimated NAIRU that confidence returns that the labor market is experiencing excess demand (or supply), the more normal response of real interest rates to incremental declines in the unemployment rate would again become appropriate.

But you have to be careful about overdoing caution as well as overdoing aggressiveness. If you take care to adjust your estimate of the NAIRU and the output gap in response to incoming data, you would be unwise to ignore these revised measures of the unemployment and output gaps in setting policy.

2. What is the meaning of symmetric and asymmetric directives?

I think that it has become clear that symmetry/asymmetry is a subtle concept. I believe that there are two interrelated dimensions of the so-called tilt or bias in monetary policy. The first dimension relates to the balance of risks going forward. The second relates to the probability of a near-term policy change.

The tilt provides the market with an indication of the Committee's balancing of the risks related to emerging excess demand or supply and inflation going forward, and hence in what direction policy is more likely to move. In a symmetric directive, the risks are viewed as evenly balanced, so that the next rate move could as easily be up as down. In an asymmetric directive, the risks are viewed as tilted in one direction or the other, so there is, for example, a greater likelihood of an increase than a decrease in rates. A symmetric directive also indicates little prospect that a near-term move will be required if the economic outlook evolves roughly as expected.
An asymmetric directive, in contrast, alerts the market to the greater possibility, though not the certainty, of a move in a particular direction over some near-term policy horizon.

At the December 1998 meeting, the FOMC decided that, going forward, it would announce the tilt, and explain the reasoning behind it, in those cases where the change in the Committee's view of the balance of risks was "significant" and where announcing this change to the public was viewed as "important." This made the announcement of the tilt, in effect, another policy tool, because an announced change in the tilt would move market rates, though to a lesser degree than a change in the rate itself.

This decision should be understood as an effort to increase the transparency of monetary policy and to allow the Committee to communicate more clearly its views of the balance of risks and the prospects of further policy actions going forward. Underlying this effort is the view that financial markets operate more efficiently when they have more complete information, and a preference for signaling markets about prospective policy actions rather than surprising them.

The early experience with its use suggests that announced adjustments in the tilt sometimes have unexpectedly large effects on financial markets and on the reaction of markets to subsequent data or statements by FOMC members. It has also become clear that it is not easy to communicate some of the subtleties and complexities of monetary policy intentions in a single word. When the signaling is, as a result, imperfect and the Committee's intentions are misperceived, market rates may move in directions inconsistent with, or by more than is justified by, the balance of risks and the likely course of policy. The recent experience suggests that the use and announcement of tilts should be viewed as a "work in progress" rather than a well-tuned and final product.

3. Does the response of the bond market to evolving economic developments reduce or eliminate the importance of activist monetary policy?

A theme we sometimes hear is that the FOMC can take a permanent vacation, leaving the conduct of monetary policy to the bond market. I have heard this referred to as the "gyroscope" theory, in that it portrays the bond market as the gyroscope of the economy, sensitively responding to developments so as to stabilize the economy. Were it only so simple!

It is sometimes said at the Federal Reserve that when we look at the bond market we are really looking at ourselves in the mirror. This means that market participants are responding to the data, their changing forecast, and their understanding of our policy reaction function. Long-term rates are, after all, based on current and expected future short-term rates. Expected future short-term rates, in turn, are very much a function of the Fed's policy response.

What this means is that when bond rates rise in the expectation of future monetary policy tightening (that is, in the expectation of higher short-term rates), we have a choice. We can confirm the expectations by tightening, preserving the higher bond rates. Or we can contradict those expectations by leaving policy unchanged, likely resulting in some reversal of the initial movement in bond rates (as the bond market comes to better understand our policy reaction function).
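The mirror image just described is the expectations hypothesis of the term structure. In a stylized two-period form – my notation, abstracting from term premia – the bond rate is

\[
i^{(2)}_{t} \approx \tfrac{1}{2}\left( i_t + E_t\, i_{t+1} \right),
\]

so a rise in the expected future funds rate \(E_t\, i_{t+1}\) pushes up bond rates today, before the Fed acts; whether that rise persists then depends on whether the expectation is confirmed or contradicted.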
There is, however, also the possibility that the failure to tighten when such a move is widely expected may leave in place higher long-term rates (or raise them further), if our failure to act is viewed as inconsistent with our commitment to price stability.

Such preemptive pricing in the bond market is the private sector analogue to preemptive moves in the federal funds rate. When the bond market is correct, its preemptive pricing should be rewarded by the expected movement in the federal funds rate. To fail to so reward it would be to undermine the ability and willingness of the bond market to engage in such preemptive pricing. On the other hand, when the bond market is viewed as having inappropriately built in expectations of higher rates, monetary policy ought to provide an anchor, in the form of an unchanged funds rate, to which the bond market can return as incorrect expectations are unwound.

The bottom line is – no vacation for the Fed. But preemptive pricing in the bond market can make monetary policy more effective by speeding the response of long-term rates to changing economic conditions. There is a potential synergy here between monetary policy and the bond market. The more transparent monetary policy is, the more effective preemptive pricing in the bond market will be. The more effective preemptive pricing in the bond market is, the shorter the lag from a change in monetary policy to the effect on aggregate demand, and hence the more effective monetary policy is.

III. Conclusion

Let me sum it all up, first by clarifying the logic of the recent Fed tightenings and then by providing some insight into the prospects for near-term monetary policy in light of the comments above on the outlook and the strategy of monetary policy.

In my view, the current monetary policy problem has two dimensions. The first dimension is to adjust the federal funds rate so that it is appropriate in terms of prevailing utilization and inflation rates, with appropriate regard to the uncertainty about the measurement of utilization rates. I have called this the reassessment issue because it involves a reassessment of the desirability of the full amount of easings implemented last fall. The second dimension involves the adjustment in the funds rate going forward, in response to incoming data and changes in the forecast.

The easings last fall were implemented at a time of sharp dislocations in financial markets and sharp downward revisions to the forecasts of both global and U.S. growth this year. I view the recent tightenings as the partial reversal of the earlier easings, in response to the reversal of the factors that motivated them. Financial markets have clearly improved. The global economy looks stronger, and the United States is now projected to expand this year at a multiple of the rate projected last fall. In determining how much of the easings to reverse, one also has to take into account the decline in core inflation since last fall, as well as uncertainties about translating the current utilization rates into measures of excess demand.

What can and will I say about monetary policy going forward? I always like to point out that I am prepared to be quite explicit about the course of policy going forward. The answer to the question of what policy changes I expect is simple: "It depends." Specifically, it depends on the incoming data and the evolving forecast.
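Two ideas flagged earlier in these remarks – a Taylor-type pro-cyclical response of the real funds rate, and an attenuated response inside a zone of uncertainty around the estimated NAIRU – can be combined in a short sketch. This is purely illustrative: the functional form, the coefficients, the neutral real rate and the band width are all assumptions chosen for exposition, not a description of any actual FOMC rule.

```python
# Illustrative only: a Taylor-type rule whose response to the unemployment
# gap is attenuated inside an uncertainty band around the estimated NAIRU.
# All coefficients and the band width are assumptions for exposition.

def policy_rate(inflation, unemployment, nairu_estimate,
                r_star=2.5, target_inflation=2.0,
                a_pi=0.5, a_u=1.0, band=0.75):
    """Return a nominal funds-rate setting, in percent."""
    u_gap = nairu_estimate - unemployment        # positive => excess demand
    if abs(u_gap) <= band:
        u_gap_effective = 0.0                    # muted response: gap too uncertain
    else:
        # respond only to the part of the gap outside the uncertainty band
        u_gap_effective = u_gap - band if u_gap > 0 else u_gap + band
    real_rate = r_star + a_pi * (inflation - target_inflation) + a_u * u_gap_effective
    return real_rate + inflation                 # Fisher: nominal = real + inflation

# Unemployment half a point below an (uncertain) NAIRU estimate: no tightening.
print(policy_rate(inflation=2.0, unemployment=4.3, nairu_estimate=4.8))  # 4.5
# A much tighter labor market: the "more normal response" resumes.
print(policy_rate(inflation=2.0, unemployment=3.5, nairu_estimate=4.8))  # 5.05
```

With these assumed numbers, small unemployment gaps elicit no change in the real rate, while gaps clearly outside the band draw the gradual pro-cyclical response described in the text.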
Remarks by Mr Edward W Kelley Jr, Member of the Board of Governors of the US Federal Reserve System, before the Banking Regulators' News Conference on Y2K Readiness at the National Press Club in Washington, D.C. on 16 September 1999.
Mr Kelley remarks on Y2K readiness in the United States

Remarks by Mr Edward W Kelley Jr, Member of the Board of Governors of the US Federal Reserve System, before the Banking Regulators' News Conference on Y2K Readiness at the National Press Club in Washington, D.C. on 16 September 1999.

* * *

I'm appearing here today in my role as Head of the Federal Reserve's Year 2000 project. As you know, the Fed provides the crucial infrastructure that undergirds the nation's banking system. It distributes currency and coins and processes checks and electronic payments. It's vitally important to the nation that this infrastructure functions smoothly through the rollover to the new century. We are confident it will.

System readiness

By 30 June, 100% of the Fed's mission-critical systems were certified as Y2K-ready. That is, they were fixed and are now in daily use. This includes all of the systems that support electronic funds transfer, book-entry securities transfer and check payments.

Not only are we ready, but the major users of our most critical payments services – depository institutions serving millions of Americans – are ready too. They have tested their systems with our system, using the Year 2000 date. The results of these tests, involving more than 9,000 institutions, have assured us that the nation's banks will be able to send and receive funds through the Federal Reserve on 1 January and thereafter.

Now, I don't want to imply that this effort has been easy. We estimate we have reviewed and certified 90 million lines of code. As a result of this tremendous amount of very good work, I am convinced that we are as ready for business on 1 January 2000 as we are for business tomorrow.

Internationally, a test in June of 34 separate national and international payments systems in 19 countries showed a high level of readiness. More than 500 financial institutions successfully completed simulated Year 2000 transactions on the systems.

Liquidity issues

In addition to fixing our computer hardware and software, we have taken a number of actions to ensure that both consumers and businesses have access to liquid funds during the rollover.

As most of you know, the Fed increased its 1999 currency order to the Treasury Department's Bureau of Engraving and Printing to make sure we had enough cash in our vaults to meet any additional demand related to the date change. We have stressed that we see no need for the public to hold additional cash. We feel strongly that the most sensible thing to do with your money is to leave it where it is, but our responsibility is to make sure the public knows that currency is readily available. And it is. We finished building up our currency inventory in August, and depository institutions are arranging to have a large amount of extra cash in their vaults as well.

Separately, the Board has taken several steps to ensure that bank and thrift customers have continuous access to funding. First, we advised the industry to make the advance arrangements necessary to borrow at the discount window, and many have already done so. And we created an administrative structure to more easily serve the backup needs of credit unions.

We also created a special liquidity facility to provide banks and thrift institutions access to funds at a predetermined price. This borrowing arrangement will be available as of 1 October. Its creation sends a clear message: the Fed will be there to provide liquidity should banks need it. That, in turn, will enable banks to offer their customers similar assurance.
Earlier this month, the Federal Reserve Bank of New York announced several changes to its daily open-market procedures intended to ensure the smooth operation of financial markets around the century rollover.

All of these actions have a common theme. They are meant to build confidence that the Fed is determined to play its traditional role as a provider of liquidity in uncertain times.

Summary

In sum, our systems are ready. We are working to build public confidence. We will be on the job constantly during the rollover period and we will be prepared for any problems that may arise. No one can guarantee the complete absence of isolated glitches, but Americans should feel confident the Federal Reserve and our depository institutions have done everything that can be done to make the transition to the Year 2000 as smooth as possible.
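As background for readers who never met the defect itself, the remediation described above targeted code that stored years as two digits. A minimal, hypothetical illustration in Python – not drawn from any Federal Reserve system – of the bug and the common "windowing" fix:

```python
# Hypothetical illustration of the classic Y2K defect and a common fix.
# Legacy systems stored years as two digits and assumed the century was 19xx,
# so dates in 2000 were treated as dates in 1900.

from datetime import date

def legacy_parse(yy: int) -> date:
    # Buggy legacy convention: every two-digit year is 19xx.
    return date(1900 + yy, 1, 1)

def windowed_parse(yy: int, pivot: int = 50) -> date:
    # Remediated convention ("windowing"): years below the pivot map to 20xx.
    return date((2000 if yy < pivot else 1900) + yy, 1, 1)

print(legacy_parse(0))     # 1900-01-01 -- a century off after the rollover
print(windowed_parse(0))   # 2000-01-01
print(windowed_parse(99))  # 1999-01-01
```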
Remarks by Mr Edward M Gramlich, Member of the Board of Governors of the US Federal Reserve System, before the Electronic Payment Symposium, University of Michigan, Ann Arbor on 17 September 1999.
Mr Gramlich assesses the state of the electronic money transformation in the United States and abroad

Remarks by Mr Edward M Gramlich, Member of the Board of Governors of the US Federal Reserve System, before the Electronic Payment Symposium, University of Michigan, Ann Arbor on 17 September 1999.

* * *

Although the United States economy seems to be leading the world in the adoption of new computer and internet-based technology, the picture is not uniform. The United States is not at the forefront in the adoption of electronic money systems, one area that would seem most eligible for the information revolution. Adherence to traditional payment systems, check and cash, is very strong. The United States is the only developed country in the world where check use is still increasing, with the number of checks written growing nearly as fast as the overall economy. Use of cash is extensive as well. Americans still use cash for about three-quarters of all transactions. The total US supply of coin and currency now comes to $550 billion, about one-third of which is actually circulating in this country (the remainder is held abroad). But even after subtracting estimated foreign balances, the supply of outstanding coin and currency comes to $670 per capita, a figure that strikes most people as incredibly large.

For many years, observers have looked forward to the advent of electronic money, a system that uses either a computer chip or another electronic device to record payments and debits automatically. There are obvious efficiency advantages in terms of ease of handling and record-keeping for consumers, merchants, the banking system and the Federal Reserve. Use of electronic money systems appears to be growing in at least a few foreign countries. But in this country, growth of electronic money systems is sluggish – well behind earlier predictions, well behind the growth of credit and debit card use and way behind the growth of other types of electronic commerce. Even fervent advocates of electronic money will admit disappointment at its rate of adoption.

In this talk I will try to assess the state of the electronic money transformation, here and abroad. I will mention some promises, some stumbling blocks and some technological and regulatory issues that will have to be dealt with as electronic money use proceeds.

Historical antecedents

The idea of using technology to improve the efficiency of the payment system is very old. World commerce has seen an evolution from barter to valuable metals, to paper money and to checks. In the United States, as early as 1853, more than a half-century before the Federal Reserve System came into existence, private banks in the New York area formed an exchange and settlement system that economized on the movement of paper and settlement balances. In 1970, some of these arrangements were automated and became the basis for a wire transfer system called the Clearing House Interbank Payments System (CHIPS). CHIPS is still the leading clearing and settlement system for large dollar interbank payments (average size of $6 million) originating in international trade and finance.

When the Federal Reserve System came on the scene in 1913, one of its early tasks was to eliminate gold transfers and exchange rate differentials between the dollar and gold across the Federal Reserve Districts. The Fed created a "goldwire" system that involved inter-District settlements through the Gold Settlement Fund. This system quickly gave way to a telegraphic system for adjusting reserve balances.
This system in turn evolved into a highly automated and centralized Fedwire system to handle large volumes of high-value payments settled in reserve balances on a real-time basis. The Fedwire system is the key interbank settlement system for the federal funds market. Average payments are around $3 million, but the system is also used for much smaller time-critical payments.

There is also a system for handling smaller retail payments, now averaging about $3,000. In the early 1970s, a group of California commercial banks, in cooperation with the Federal Reserve Bank of San Francisco, organized an automated clearing system operating within the San Francisco District. A similar development occurred in the Atlanta District, with groups in other Districts studying similar systems. These efforts led to the creation of an Automated Clearing House (ACH) throughout the entire Federal Reserve System. The federal government, along with some other large private employers, began making some payroll deposits through this system in 1975. I vaguely remember the University of Michigan switching to automatic payroll deposit at about that time, apparently one of the early large employers to switch.

Use of the ACH has grown and is still growing rapidly, with the number of payments made through ACH rising at double-digit annual rates right up to the present time (in contrast to the sluggish growth of electronic money use). But there is still vast potential for further ACH growth, because only about 45% of payroll payments are now made through the ACH and only about 8% of consumer bill payments. Since the marginal cost to consumers of making payments by ACH is about half the cost of making payments by check, in the end the most interesting question may be why the ACH has not displaced the costlier check system even more rapidly.

Electronic money

With that background, we now turn to our main topic, electronic money. This term is normally taken to refer to a stored-value product, where a prepaid balance of funds is recorded on a card or personal computer controlled by the consumer and updated automatically as payments are made in or out (a minimal sketch of this mechanism appears at the end of these remarks). The stored-value balance would be recorded as a liability of the institution, financial or otherwise, that issues the card.

Early stored-value cards recorded the account balance on a magnetic strip, but magnetic strip cards can be difficult to reload and are easy to tamper with. Future smart cards are likely to record stored value through an embedded microprocessor chip, which permits sophisticated encryption to protect against counterfeiting. These cards can also be used as credit cards, debit cards or repositories of other identifying information for the consumer.

While much rarer, electronic money systems can also be computer-based. Under these systems, variously called e-cash, cyber coins and cyberbucks, computer software generates electronic (virtual) tokens that serve as cash. The seller has to verify the tokens and the issuer may have to settle them – operations that at this point have proven costly and difficult.

Stored-value products can be used in open or closed systems. Closed systems involve a narrowly defined group of consumers, such as riders of a metropolitan transportation system or students at the University of Michigan. The infrastructure necessary to redeem the value is in place either at transportation system terminals or owned by merchants within a relatively small geographic area.
Open systems involve many consumers and merchants over an extended geographic area, with the stored-value cards really functioning as electronic money.

One way to assess the potential of electronic money is to analyze the benefits and costs of electronic money as compared with alternative ways of making payments. I will discuss this separately for consumers, merchants, financial institutions and the Federal Reserve System. While the potential gains to all groups together may not be huge, they do seem to be positive, adding to the puzzle about the slowness of adoption of stored-value technology.

From the consumer's standpoint, there could be many potential advantages of electronic money, but there are also risks. Stored-value cards are easier to handle than either cash or checks. Although most electronic money systems now do not have PIN protection, that protection is technologically possible, and protection against loss through theft could be developed in the future. There is some risk of insolvency by the issuer, but at least for financial institutions that risk is very low statistically. A greater problem is that, until the stored-value system becomes universal, there is a risk that sellers will not accept the card. Payment systems involve a network – money is not truly money unless it becomes nearly universally acceptable – and network problems have been a big impediment to the development of any currency system. This is one reason that electronic money systems may be more likely to develop in natural closed systems or in small countries.

One apparent cost of electronic money for consumers, at least in comparison with credit cards, is that of float. In benefit-cost parlance, float is an internal redistribution – a cost of electronic money felt by consumers that is offset by gains to other sectors. Float, then, should not affect the overall social interest in electronic money. Moreover, in the long run we might expect the competitive market to offer electronic money products that compensate consumers for their loss of float.

Electronic money systems might raise a potential trade-off between protection and privacy. Consumer protection could be enhanced by stored-value systems because there could be a record of all transactions, and this record could be used to resolve disputes or to deal with losses due to theft, in much the same way that consumers' credit cards are now protected. At the same time, this record could also be misused to defraud consumers or infringe on their financial privacy, by irresponsible action on the part of the issuing institution or in some other way. Consumers could sacrifice their protection but retain their privacy by purchasing cards without this extensive record-keeping or by signing "opt out" forms to prevent the disclosure of confidential information. Consumers could also limit invasions of privacy by keeping low balances on their cards or restricting their use, but this limits the utility of electronic money in the first place.

From the standpoint of merchants, electronic money systems raise orthodox benefit-cost questions. It is costly for merchants to invest in the infrastructure to process stored-value cards, much more so than for credit cards because of the encryption technology or the difficulty of verifying electronic tokens. But once merchants have made the investment, costs of handling at least some types of payments should go down. Compared with credit and debit cards, there may be little cost reduction.
Compared with checks, there is less risk of default, because the sale will not be completed unless the stored value is adequate to make the purchase, and the amount of the stored value can be determined immediately. Compared with cash, there is less risk of theft, at least out of the traditional cash register. While merchants have not eagerly invested in the infrastructure because of the network problems mentioned above, in the long run there should be some gains to them from electronic money technology.

For financial institutions, stored-value products may offer new opportunities for delivering banking services and improving security through computer chips. Consumers may switch from using cash to using deposits that are linked to stored-value products, say with automatic loading from an ATM machine or with links to a personal computer. This could be a profitable new market for financial institutions with relatively little risk. While financial institutions have not been early to climb aboard the electronic money bandwagon either, they may become strong supporters at some point.

If financial institutions issue the stored-value cards, there is some protection against abuses because financial institutions are already regulated, particularly with respect to their solvency. Regulations could also encompass an institution's disclosure policy, its privacy policy and its following of other proper procedures. Many institutions are now announcing very strict disclosure and privacy policies, and extending these policies to stored-value products should be straightforward.

In one early indication of coming complications, the Federal Deposit Insurance Corporation (FDIC) has ruled that electronic money accounts issued by insured institutions are not always legally defined as deposits under the deposit insurance act. This means that stored-value balances do not always carry deposit insurance. By contrast, in virtually all European countries, some of which seem to have more rapid growth of electronic money products, stored-value balances are typically accorded the same protection as deposits issued by a bank. At this point it is unclear whether the negative FDIC decision could be a major reason for the sluggish growth of stored-value products in this country. In a statistical sense, risk of loss of value through financial insolvency is minor and is probably much less of a factor to consumers than risk of loss of the card or risk that merchants will not accept the card.

Finally, governmental institutions could have a stake in stored-value technology. The Debt Collection Improvement Act of 1996 encourages the federal government to make most of its payments electronically. When consumers do not have bank accounts, as about 10% now do not, governments have at this point made most payments through debit cards. But at some point, stored-value products might become an easy way for governments to pay social security benefits, unemployment insurance benefits and even welfare and food stamps. Innovations are often criticized because they leave low- and moderate-income groups behind, but this is an innovation that might actually benefit low- and moderate-income groups disproportionately.

Returning to the protection-privacy trade-off, government crime control agencies may find stored-value transactions easier to trace than cash transactions. But as said above, to the extent that the government may gain records of transactions, consumers will lose a corresponding degree of privacy.
When this issue has come up in the past, the desire for customer privacy has clearly won out, and there is no obvious reason that the balance of political weight will shift in the future. While the potential is there for increased crime control, this potential seems unlikely to be realized.

Federal regulators have also had to face the issue of whether to regulate stored-value products. On the one hand, when networks are important, there is something to be said for having the government define a technology early to provide a standard for future development. As mentioned earlier, the role of the Federal Reserve was critical in the past development of some of our major payment systems. On the other hand, the existence of an already well-functioning payment system is an argument for a more passive governmental regulatory posture. Presumably, now that the main institutions are in place, payment systems can develop in ways that effectively meet consumer desires. One can imagine network externalities, but it is not obvious that the natural development of payment systems in an environment with minimal regulation will lead to serious inefficiencies. For what it is worth, federal regulators have taken the passive posture, and there is at this time little federal regulation of stored-value products.

Summing all of this up, it seems that all groups – consumers, merchants, financial institutions and governments – can realize potential gains from electronic money. The gains will be greater, the easier it is to establish natural networks or closed systems. Net gains will be correspondingly less, the more convenient alternative payment systems are.

Foreign experience with electronic money

Moving from the general to the specific, I now review the actual experience with electronic money in other developed countries. As I said earlier, adoption of electronic money products is proceeding somewhat more rapidly in some foreign countries than at home, and it makes sense to examine the reasons.

Electronic money pilot programs are at least in the development stage throughout much of the world. A recent survey found that fifty countries had at least some electronic money pilot projects. The vast majority of the projects involved stored-value card systems, with a handful of countries having at least small-scale experimental network systems.

Seven countries appear to be making the most use of electronic money products. Switzerland, the Netherlands and Hong Kong have numbers of outstanding cards exceeding 75% of their population, though in Hong Kong a great many of the cards are for transportation purposes. Germany, Singapore, Belgium and Austria have numbers of outstanding cards exceeding 40%, though many of these are linked to debit cards. By contrast, the US number is a fraction of 1%. But even in countries where card use is widespread, the cards are used for very small transactions and account for a small share of overall payments.

Many countries have adapted their commercial codes to electronic money products. They have protocols for dealing with fraud, loss, theft and other disputes, and privacy disclosure requirements. Many of these countries have anti-money-laundering provisions, and, unlike the United States, most have ruled that deposit insurance does cover electronic money products.

While it is difficult to generalize, the countries where stored-value products seem to have made the biggest inroads are small, implying that electronic money systems work better in a natural closed system or in isolated areas.
They also work better in more developed countries, where wages and personnel costs to handle less automated payment systems are higher. The small average size of stored-value transactions suggests that stored-value cards will substitute for currency, if they substitute for anything. In that light, the average ratio of currency holding to GDP in the seven countries where stored-value cards are common is 6.8%, higher than the average ratio of 6% for all developed countries. It is also much higher than the ratio for the United States. Nominally, the US ratio is 5.2%, but when we consider that two-thirds of US currency is held abroad, the effective US ratio is less than 2% (5.2% × 1/3 ≈ 1.7%).

This comparison implies that stored-value products have their best chance of success in countries where use of currency is widespread. Even though the United States seems to have a lot of domestic currency outstanding, currency usage is even higher in these other countries.

Domestic experience

In this country there are a number of closed electronic money systems, but still relatively few open systems. In Atlanta, more than 1.7 million electronic money cards were produced for the 1996 Olympic Games, and these cards paid for about 200,000 transactions totaling $1.1 million. The Olympics are over and this system has now been discontinued. Two separate pilot programs, involving 100,000 customers and 1,200 merchants, began in New York City, but were discontinued as of December 1998, largely because of sluggish acceptance by consumers and merchants. Nationally, the firm Digicash tried a network-based system, but acceptance was again sluggish and the system was discontinued. Cyber Cash and Wells Wallet are now trying other network-based systems.

There is a growing amount of commerce on the Internet, but at this point only a minuscule amount takes place through stored-value products. Most takes place through credit cards, not electronic money systems. It may be easier and cheaper for consumers to make Internet transactions directly through stored-value routines, or it may not. Given the slight cost differences between Internet credit card transactions and Internet stored-value transactions, it may be realistic to look towards a continuation of the slow development of stored-value Internet transactions. Stored-value transactions would seem to offer the biggest cost savings relative to cash, not relative to existing Internet transactions.

Conclusions

This brief survey of electronic money suggests some tentative conclusions. Stored-value products could have a promising future – they could become easier or cheaper for every potential stakeholder: consumers, merchants, financial institutions and governments. At the same time, there have been continuing innovations in the payment system in this country, and part of the reason for the slow adoption of electronic money products here is surely that innovations in credit cards, debit cards and the ACH have made it much easier and safer to conduct transactions through these systems. Stored-value products have done better in small countries and in countries where there is greater use of coin and currency.

Looking ahead, the United States may be one of the countries least likely to adopt open stored-value systems. It is a big country with many alternative ways of making payments. Americans are strongly attached to checks, and use of credit cards, debit cards and the ACH is growing rapidly. Even cash is still widely used for small transactions, though less so than in other countries.
A major hurdle for electronic money products is the network problem. If most merchants do not accept stored-value products, most consumers will not bother with stored-value cards. This is the classic chicken-and-egg problem, a problem that often gets solved by slow growth on both sides of the market. In principle, the government could intervene and force or induce merchants, consumers and financial institutions to adopt the new technology, but in practice, alternatives to stored-value products are cheap and safe enough that such intervention is both economically unwise and politically unlikely.

It is also possible that this network consideration leads to the development of many closed electronic money systems – for transportation, for university students, for one-employer towns and the like. This is perhaps not the visionary's dream, but even the development of closed systems cuts the demand for cash and checks and makes the payment system more efficient.
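Here, finally, is the minimal sketch of the stored-value mechanics promised earlier: a prepaid balance, loaded in advance and debited at the point of sale only when it is adequate to cover the purchase. The class and its fields are invented purely for exposition; real card schemes add chip-level encryption, issuer settlement and, optionally, the transaction records discussed in the protection-privacy trade-off.

```python
# Illustrative sketch of a stored-value balance: prepaid funds are loaded
# onto the card, and a purchase completes only if the balance covers it.
# Names and fields are invented for exposition only.

class StoredValueCard:
    def __init__(self, cents: int = 0):
        self.balance = cents        # the issuer's liability to the cardholder

    def load(self, cents: int) -> None:
        self.balance += cents       # e.g. funded from a deposit at an ATM

    def pay(self, cents: int) -> bool:
        if cents > self.balance:
            return False            # sale does not complete: value inadequate
        self.balance -= cents       # immediate, final debit
        return True

card = StoredValueCard()
card.load(2_000)                    # load $20.00
print(card.pay(550), card.balance)   # True 1450
print(card.pay(5_000), card.balance) # False 1450 -- merchant sees this at once
```

The immediate, final debit is what removes the default risk that checks carry, and the explicit refusal when the balance is short is why the merchant can verify funds at the moment of sale.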
Remarks by Mr Robert W Ferguson Jr, Member of the Board of Governors of the US Federal Reserve System, before the 2000 Global Economic and Investment Outlook Conference at the Carnegie Bosch Institute, Pittsburgh on 21 September 1999.
Mr Ferguson asks whether information technology is the key to higher productivity growth in the United States and abroad

Remarks by Mr Robert W Ferguson Jr, Member of the Board of Governors of the US Federal Reserve System, before the 2000 Global Economic and Investment Outlook Conference at the Carnegie Bosch Institute, Pittsburgh on 21 September 1999.

* * *

The last few years have seen an explosion in the uses of information technology throughout the American economy. At the same time, trend US productivity growth appears to have risen to its highest rate since the 1970s. Casual empiricism would suggest a connection – that the enormous investment in computer technology that has been going on for at least twenty years has finally started to bear fruit. But although information technology is available, at least in theory, to the whole world, the recent surge in productivity growth appears to have been stronger in the United States than elsewhere, including the other industrialized countries. This raises the interesting questions of what else, besides simple availability, is needed to translate the promise of information technology into real productivity gains, and whether – whatever it is – the United States has more of it.

These questions are impossible to answer with precision, so the purpose of my talk this afternoon is to identify particular features of the American economy that might contribute to an especially hospitable climate for translating the potential gains from information technology into actual productivity growth.

I also want to be careful not to suggest that gains from the use of information technology can safely be assumed to go on indefinitely at their recent pace. The recent technical advances represent a continuation of a long string of fundamental leaps in technology that have worked their way through the economic system over many years, boosting productivity growth in the process, but how long a particular innovation has a beneficial effect on productivity growth is difficult to say. That depends on the rate of investment in the equipment that embodies the new technology, the rate at which the labor force is able to acquire needed skills, and, of course, on the fundamental potential of the technology itself. History demonstrates that the boost to productivity growth from a particular technological advance is not unlimited and eventually will be fully exploited. Just as "trees do not grow to the sky", so, too, increases in the rate of productivity growth from any given advance are not without limits. One should be cautious in extrapolating from past trends.

What we do know

What we do know is that the use of information technology, at least in the United States, has been growing by leaps and bounds. For example, personal and business Internet sites have proliferated at an astounding rate, and a wide variety of tasks that used to require person-to-person contact, such as making airline and other travel reservations and choosing and ordering merchandise, can now be done via the Web. Search engines have cut the time needed to track down a person or an item to a fraction of the amount previously required, clearly raising productivity. Improved communication and information flow are only part of the story; increases in computing power also allow workers to complete a variety of tasks more quickly.

Although it sometimes feels as if these changes are taking place at lightning speed, most of us know that this is partly an illusion.
Much of the basic technology has been around for decades. In fact, as productivity growth slowed in the 1970s and continued to languish in the 1980s, many observers wondered whether the supposed benefits of cheaper and more powerful computers would ever be realized. This highlights an important facet of the innovation process: the benefits of a new technology are in no sense automatically conferred on the economy, but will show up only after the technology is widely adopted, capital facilities are refitted and adapted to it, and workers learn to use it. For instance, Paul David, a Stanford University economist, notes that electric motors did not boost productivity growth appreciably until more than forty years after Edison installed the first dynamo in 1881.

Though a new technology typically will not be fully incorporated overnight, the speed of its adoption can be faster or slower depending on the institutional and other features of the economy. For information technology, the process of incorporation appears to be taking place at a considerably faster rate in the United States than in other parts of the industrialized world. For example, there are more than 23 million Internet hosts in the United States – roughly one for every 11 people. Canada is second among the major industrial economies with 1.6 million, one for every 19 people. In contrast, the ratios are about 1 to 128 people in France and 1 to 174 people in Italy.

These statistics suggest that technology is being used more widely in the United States, but is its use paying off? The answer appears to be "Yes." One pertinent piece of evidence is a recent study of the costs of managing cash flow in American versus European firms. The study showed European costs to be roughly 30% above those of US firms, largely because of the slow adoption of information and other computer-based technologies. This is pertinent because finance operations of this type are an essential activity in every firm.

If it is the case that the United States, at least at this juncture, is somewhat ahead of the rest of the world in realizing the benefits of information technology, does this indicate that our economy or society possesses certain characteristics that are particularly conducive to rapid diffusion of technical change? A recent study of innovation by the Agamus Consulting firm provides some interesting insights. The United States placed second to the Netherlands out of thirteen countries in a survey that asked companies to rate the "innovation climate" in their home countries. A second and possibly more objective ranking, based on a measure of innovative success developed by Agamus, placed the United States first. The most important factor cited as conducive to innovation was the overall educational standard. Given various cross-country comparisons of educational systems, I assume that this finding must be based on our broad-based attainment of higher education.

Additional possible relevant factors

I would like to suggest several other factors that might also make some difference. This is not meant to be either a comprehensive or a definitive list. Instead, it is an attempt to advance some hypotheses that might help to explain the recent dynamism of the American economy compared with some of its major trading partners – in particular its link to technological change – and to see to what extent the evidence supports them.
The particular features that I would like to discuss (not necessarily in order of importance) are corporate governance and, especially, a focus on maximization of shareholder value as opposed to other objectives; flexibility of labor markets and the willingness to accept high rates of labor turnover; willingness on the part of labor to continue to invest in human capital over a lifetime; the regulatory environment; and the friendliness of the institutional environment to entrepreneurship.

Corporate governance

Clearly, the effectiveness of the system of corporate governance is important in overall corporate performance. Adoption of new technology may require considerable effort and short-term expense, and it is important that managers have the appropriate incentives to search for improvements that reduce costs over the longer term. In recent years, aggressive cost-cutting in the United States has been linked to greater emphasis on maximization of shareholder value and less on growth and diversification, which was more prominent in the 1970s and 1980s. This shift in perspective appears to have been driven, at least in part, by the increasing dominance of large institutional investors in US financial markets.

In the past, maximizing shareholder value had not been as widely embraced abroad, although there are indications that views are changing. For instance, members of corporate boards in Japanese companies often have been promoted from within, fostering control by allied industrial concerns, family interests, banks and holding companies, which may be more motivated by concerns other than maximizing shareholder value. In Europe, a reason sometimes cited for the delay in the adoption of shareholder value as a motivating factor for corporations is the greater involvement by the public sector in the economy, with a strong emphasis on job preservation.

Another important incentive for managers to maximize shareholder value is pay-for-performance through avenues such as stock options. Although now commonplace in the United States, such instruments were not legal in Germany and Finland until 1998. However, pressure for change clearly has started to emerge. Financial market liberalization has increased the importance of equity and publicly traded debt as sources of finance. Anecdotal reports suggest that the concept of maximization of shareholder value has been gaining greater acceptance as firms turn more to stock and bond markets for financing and as governments, particularly in Asia, have increased disclosure requirements. If the trend continues and other countries move further towards the US model, it will be interesting to see whether improvements in productivity growth follow.

Labor market flexibility

Another factor that is often cited as a major element in the dynamism of the US economy is the extraordinary flexibility of our labor markets, especially in contrast to those of continental European countries. Although much European regulation has been directed at saving jobs, it can be argued, in fact, to have had the opposite effect in the aggregate, as evidenced by the marked upward drift in continental European unemployment rates over the past two to three decades. These jobless rates are now much higher than those in the United States and the United Kingdom, which has also undergone a period of substantial labor market deregulation.
It is also noteworthy that the unemployment rate differential is particularly large in younger age groups; youth unemployment rates were around 30% in 1997 in France and Italy, for example. To the extent that younger people are likely to have had more exposure to information technology in the educational process, this bias by itself could potentially be an important obstacle to the incorporation of technology into business processes. In addition, workers who are unemployed for long periods of time are likely to see their technology skills deteriorate.

But why might regulations designed to protect jobs have such a perverse effect? The evidence suggests that new technology often results in more growth in employment in innovating industries. However, it also tends to shift demand from unskilled to more highly skilled workers, potentially displacing unskilled workers in the process. Job protection regulations that affect a firm's flexibility to recruit and dismiss workers can interfere with this process, making it difficult both for newcomers to find jobs and for firms to adopt new technologies.

The inability to adjust hours flexibly through the use of overtime, part-time and temporary work may also stifle innovation. According to the European Car Assembly Association, similar research projects take much longer to complete in Germany than in the United States because of shorter working hours and less flexible working conditions. As a result, they argue that German automobile manufacturers are less able to exploit the shorter product life cycles associated with more fashionable and high-tech cars.

It is axiomatic that in a truly flexible labor market everyone who wants a job can find one – at some price. As technical change increases demand for skilled relative to unskilled labor, the unskilled workers must acquire new skills, find new jobs at lower relative wages or become unemployed. OECD data suggest that the United States and Canada have been more successful than the other industrialized countries in achieving these adjustments and, therefore, in maintaining aggregate employment growth on a par with labor force growth in the face of differential rates of job growth by occupation. White-collar, high-skilled employment increased at a much faster rate than employment in the other categories in nearly all cases in the G7 countries over 1979-95. In the United States and Canada, white-collar, low-skilled employment also rose at healthy rates while blue-collar employment was little changed. In contrast, blue-collar employment fell sharply in most of the other countries.

Human capital

Of the three choices facing an unskilled worker in a fast-changing economy, acquiring new skills – that is, increasing one's human capital – would seem, in general, to be preferable to either taking a pay cut or becoming unemployed. To what extent do American workers take advantage of such opportunities relative to the rest of the world? Here the evidence is somewhat mixed. The US adult population has the highest rates of completion of upper secondary or higher education of any of the major industrialized countries. However, educational attainment rates are only part of the story, as skills need to be continually upgraded in a world of rapid technical change. This does not suggest that educational attainment is unimportant; in fact, there is a clear interaction between educational attainment and continuing education, as more highly educated people are also more likely to participate in continuing education.
Nevertheless, it is hard to make the case that the United States is ahead of other countries in terms of participation in continuing education. In an OECD study of the role of continuing education and employability, the rate of participation in these programs in the United States was about average for the six countries in the sample. However, one noteworthy result was that rates of participation in training programs in the United States were below average among the young but above average among older workers. This suggests that American workers tend to keep improving existing skills or acquiring new ones as they age to a greater extent than do their counterparts in other countries.

Other business regulations

In addition to job protection legislation, other forms of business regulation may also have an impact on the climate for innovation. A 1994 survey of more than 2,000 European companies by the Union of Industrial and Employers’ Confederations of Europe found that regulations made it more difficult to minimize costs, organize production in a flexible way, reduce time to market, and reduce uncertainty. The incidence of product market regulation is lower in the United States than in continental Europe. A cross-country comparison of macroeconomic performance in terms of productivity growth and utilization of resources with the OECD’s index of the overall regulatory environment suggests that a country’s performance does improve as the regulatory environment becomes less restrictive. On a micro level, differences in the regulatory regimes of the biotechnology industry in Europe and the United States have been cited as playing an important role in explaining why US firms are ahead of European firms on important measures of innovation such as R&D expenditures and patents. Surveys of the European biotechnology industry suggest that regulatory restrictions tend to push product development towards existing technologies and force firms to conduct research abroad, although I should note that it is also claimed that American pharmaceutical companies are conducting an increasing amount of research abroad as well because of regulatory obstacles at home. In addition, regulatory regimes that promote competition foster innovation and diffusion of technologies. According to an OECD study, the United States has policies that are effective in preventing anti-competitive behavior, with Germany not far behind. Why does a more competitive environment foster innovation? One hypothesis is that competition forces firms to innovate and adopt new technologies and, therefore, increases the speed of diffusion of technology. In contrast, monopolists may have little incentive to innovate because they already control most of the market. Competition will also tend to result in the failure of unproductive businesses and facilitate the entry and success of more innovative ones. In addition, more heavily regulated firms may be less motivated to choose an efficient technology. In recent years, industries such as telecommunications, transportation, electricity and banking have undergone privatization, deregulation and increased competition in a number of countries. In many cases, these reforms were in fact prompted by technological change, which reduced large fixed costs and thus the scope for natural monopolies. Furthermore, in some of these industries, there is evidence that the move towards a more liberalized regulatory regime induced further innovation. A good example is the telecommunications industry.
Evidence on patents (one measure of innovation) and measures of productivity suggests that those countries that have extensively liberalized (such as Japan, the United Kingdom, Finland and the United States) have experienced greater innovation and larger gains in efficiency. Evidence from the telecommunications industry also suggests that the rate of technological diffusion is faster under a more competitive regulatory regime. For instance, both growth in cellular phone usage and the penetration rate for Internet hosts are much higher in more competitive market structures. This is not to suggest that regulation is necessarily a bad thing. Regulations that protect intellectual property rights reward those with creative ideas and therefore can act to stimulate cost-reducing innovations. From a broader perspective, productivity growth is obviously not society’s only priority – worker health and safety, pollution control and other societal values are important as well. Although it has been argued that regulations requiring mandated approaches to pollution reduction or worker safety tend to divert managerial energies from pursuing cost-reducing innovations, studies have shown that some regulatory changes can in fact enhance productivity by forcing a firm to develop new and more-efficient production techniques. For example, the cotton dust standard mandated by OSHA is claimed to have led the textile industry to adopt new and more cost-effective technologies. What I think this suggests is the need, as with so much in economics, to recognize trade-offs. We should recognize the broad range of society’s interests and continue to seek balance by striving for regulation that serves well-defined purposes with minimal burden.

Other institutional features

Other institutional features are also important to the climate for innovation. For instance, entrepreneurship is fostered by small firms’ access to capital markets. A lack of breadth and depth in financial institutions and markets can inhibit the financing of innovative projects by small firms. Again, the United States appears to have an advantage in this regard relative to Europe and Japan. In particular, venture capital markets here are both more developed and more geared to financing higher-risk projects, mainly start-ups in technology-based sectors with prospects of rapid growth. Furthermore, the range of investors is wide and includes pension funds, insurance companies and even private individuals. In contrast, in Europe, venture capital is geared towards more mainstream projects, and banks dominate lending. In Japan, a venture capitalist is typically a subsidiary of a large financial institution and invests mainly in established firms.

Conclusion

Obviously, the United States does not have a monopoly on technological advance. We should be neither smug nor complacent, because the US experience will certainly be adopted and adapted by other countries. Although the United States arguably has led the way into the information technology revolution, there is evidence that others are following. Scandinavia in particular appears to be embracing computer-based technology; Sweden has begun to market itself as Europe’s “Silicon Valley.” Adoption of new technologies in the United States may also have been spurred in recent years by the cyclical strength of the economy in combination with strong domestic and international competitive pressures.
With new workers increasingly difficult to hire in a tight labor market, firms have an increased incentive to find new and more-efficient ways to use existing labor resources. I might add that the current low-inflation environment also helps this process. In the presence of subdued inflation expectations, the first inclination of firms in the face of rising demand for their output, thus far at least, appears not to have been to raise prices but rather to find ways to expand output via more efficient means of production. It is clear that other countries, many of which are less far along in reaping the benefits associated with the revolution in information technology, have the potential to gain more over the period ahead. The extent to which they do realize these gains will depend on how successful they are in adapting policies that foster efficiency and competition in labor and product markets to their own unique circumstances. I wish them well.
board of governors of the federal reserve system | 1999 | 9
Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the World Bank Group and the International Monetary Fund, Program of Seminars, Washington, D.C., on 27 September 1999.
Mr Greenspan draws lessons from the global crises of 1997 and 1998

Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the World Bank Group and the International Monetary Fund, Program of Seminars, Washington, D.C., on 27 September 1999.

* * *

With the benefit of hindsight, we have been endeavoring for nearly two years to distill the critical lessons from the global crises of 1997 and 1998. From what seemed at the time to be isolated and contained disruptions in Thailand and Indonesia, economic turmoil deepened and spread, ultimately engulfing the emerging-market economies of East Asia and other parts of the globe and then the financial markets of industrial countries. The failure of normal adjustment processes to contain the financial turmoil made this crisis longer and deeper than any of us had expected in its early days. One possible clue to this breakdown of the normal adjustment process may reside not in the events leading up to the East Asian crisis in the spring of 1997, but rather in the extraordinary episode of financial market seizure that afflicted some emerging-market and industrial-country markets, particularly in the United States, a year ago. Following the Russian default of August 1998, public capital markets in the United States virtually seized up. For a time not even investment-grade bond issuers could find reasonable takers. While Federal Reserve easing shortly thereafter doubtless was a factor, it is not credible that this move was the whole explanation of the dramatic restoration of most, though not all, markets in a matter of weeks. The seizure appeared too deep-seated to be readily unwound solely by a cumulative 75 basis point ease in overnight rates. Arguably, at least as relevant was the existence of backup financial institutions, especially commercial banks, that replaced the intermediation function of the public capital markets. As public debt issuance fell, commercial bank lending accelerated, effectively filling in some of the funding gap. Even though bankers also moved significantly toward risk aversion, previously committed lines of credit, in conjunction with Federal Reserve ease, were an adequate backstop to business financing. With the process of credit creation able to continue, the impact of the capital market turmoil on the real economy was blunted. Firms were able to sustain production, business and consumer confidence were not threatened, and a vicious circle – an initial disruption in financial markets leading to losses and bankruptcies among their borrowers and thus further erosion in the financial sector – never got established. What we perceived in the United States in 1998 may be an important general principle: multiple alternatives to transform an economy’s savings into capital investment offer a set of backup facilities should the primary form of intermediation fail. In 1998 in the United States, banking replaced the capital markets. Far more often it has been the other way around, as it was most recently in the United States a decade ago. Highly leveraged institutions, such as banks, are, by their nature, periodically subject to seizing up as difficulties in funding leverage inevitably arise. The classic problem of bank risk management is to achieve an always elusive degree of leverage that creates an adequate return on equity without threatening default. The success rate has never approached 100%, except where banks are credibly guaranteed, usually by their governments, in the currency of their liabilities.
But even that exception is by no means ironclad, especially when that currency is foreign. When American banks seized up in 1990, as a consequence of a collapse in the value of real estate collateral, the capital markets, largely unaffected by the decline in values, were able to substitute for the loss of bank financial intermediation. Interestingly, the then recently developed mortgage-backed securities market kept residential mortgage credit flowing, where in prior years such credit would have contracted sharply. Arguably, without the capital market backing, the mild recession of 1991 could have been far more severe. Similarly, Sweden, like the United States, has a corporate sector with a variety of non-banking funding sources. Bank loans in Sweden in the early 1990s were concentrated in the real estate sector, and when real estate prices also collapsed there, a massive government bailout of the banking sector was initiated. The Swedish corporate sector, however, rebounded relatively quickly. Its diversity in funding sources may have played an important role in this speedy recovery, although the rapidity and vigor with which Swedish authorities addressed the banking sector’s problems undoubtedly was a contributing factor. The speed with which the Swedish financial system overcame the crisis offers a stark contrast with the long-lasting problems of Japan, whose financial system is the archetype of virtually bank-only financial intermediation. The keiretsu conglomerate system, as you know, centers on a “main bank,” leaving corporations especially dependent on banks for credit. Thus, one consequence of Japan’s banking crisis has been a protracted credit crunch. Some Japanese corporations did go to the markets to pick up the slack. Domestic corporate bonds outstanding have more than doubled over the decade while total bank loans have been almost flat. Nonetheless, banks are such a dominant source of funding in Japan that this increase in non-bank lending has not been sufficient to avert a credit crunch. The Japanese government has intervened in the economy and is injecting funds in order to recapitalize the banking system. While it has made some important efforts, it has yet to make significant progress in diversifying the financial system – which arguably could be a key element, although not the only one, in promoting long-term recovery. Japan’s banking crisis is also ultimately likely to be much more expensive to resolve than the American and Swedish crises, again providing prima facie evidence that financial diversity helps limit the effect of economic shocks. This leads one to wonder how severe East Asia’s problems would have been during the past eighteen months had those economies not relied so heavily on banks as their means of financial intermediation. One can readily understand that taking on unhedged short-term dollar liabilities to fund Thai baht domestic loans (counting on the dollar exchange rate to hold) would at some point trigger a halt in lending by Thailand’s banks. But why did the economy need to collapse with it? Had a functioning capital market existed, the outcome might well have been far more benign. Before the crisis broke, there was little reason to question the three decades of phenomenally solid East Asian economic growth, largely financed through the banking system, so long as the rapidly expanding economies and bank credit kept the ratio of non-performing loans to total bank assets low. The failure to have backup forms of intermediation was of little consequence.
The lack of a spare tire is of no concern if you do not get a flat. East Asia had no spare tires. The United States did in 1990 and again in 1998. Banks, being highly leveraged institutions, have, throughout their history, periodically fallen into crisis. Where there was no backup, they pulled their economies down with them. One can wonder whether the numerous banking crises of the nineteenth-century United States, when banks were also virtually the sole intermediaries, would have so periodically disabled our economy had alternate means of intermediation been available. In dire circumstances, modern central banks have provided liquidity, but fear is not always assuaged by cash. Even with increased liquidity, banks do not lend in unstable periods. The Japanese banking system today is an example: the Bank of Japan has created massive liquidity, yet bank lending has responded little. With very large non-performing loans of indeterminate value, the capital positions of Japanese banks are difficult to judge. The periodic eruption of the so-called Japanese funding premium in recent years attests to the broad uncertainty about the viability of individual banks. This understandably creates considerable caution on the part of Japanese bank loan officers in committing scarce bank capital. But unlike in the United States and Sweden a decade ago, alternate sources of finance are not yet readily available. The Swedish case, in contrast to America’s savings and loan crisis of the 1980s and Japan’s current banking crisis, also illustrates another factor that often comes into play with banking sector problems: speedy resolution is good, whereas delay can significantly increase the fiscal and economic costs of a crisis. Resolving a banking-sector crisis often involves government outlays because of implicit or explicit government safety net guarantees for banks. Accordingly, the political difficulty in raising taxpayer funds has often encouraged governments to procrastinate and delay resolution, as we saw during our savings and loan crisis. Delay, of course, can add to the fiscal costs and prolong a credit crunch. The annals of the United States and others over the past several decades tell us that alternatives within an economy for the process of financial intermediation can protect that economy when one of those financial sectors experiences a shock. Indeed, the mere existence of a diversified financial system may well insulate all aspects of a financial system from breakdown. Australia serves as an interesting test case in the most recent Asian financial turmoil. Despite its close trade and financial ties to Asia, the Australian economy exhibited few signs of contagion from contiguous economies, arguably because Australia already had well-developed capital markets as well as a sturdy banking system. But going further, it is plausible that the dividends of financial diversity extend to more normal times as well. It is not surprising that banking systems emerge as the first financial intermediaries in market economies as economic integration intensifies. Banks can marshal scarce information about the creditworthiness of borrowers to guide decisions about the allocation of capital. The addition of distinct capital markets outside of banking systems is possible only if scarce real resources are devoted to building a financial infrastructure. It is a laborious process whose payoff is often experienced only decades later.
It is thus difficult to initiate, especially in emerging economies that are struggling to edge above the poverty level. They perceive the need to concentrate on high short-term rates of return to capital rather than accept more moderate returns stretched over a longer horizon. We must continuously remind ourselves that financial infrastructure comprises a broad set of institutions whose functioning, like all else in a society, must be consistent with the underlying value system and hence its time preference. On the surface, financial infrastructure appears to be a strictly technical concern. It includes accounting standards that accurately portray the condition of the firm, legal systems that reliably provide for the protection of property and the enforcement of contracts, and bankruptcy provisions that lend assurance in advance as to how claims will be resolved in the inevitable event that some business decisions prove to be mistakes. Such an infrastructure in turn promotes transparency within enterprises and corporate governance procedures that will facilitate the trading of claims on businesses in open markets using standardized instruments rather than idiosyncratic bank loans. But the development of such institutions is almost invariably molded by the culture of a society. The antipathy to the “loss of face” in Asia makes it difficult to institute, for example, the bankruptcy procedures of western nations. And even the latter differ from one another owing to deep-seated differences in views of creditor-debtor relationships. Arguably, the notion of property rights in today’s Russia is subliminally biased by a Soviet education that inculcated a highly negative view of individual property ownership. Corporate governance that defines the distribution of power, of course, invariably reflects the most profoundly held societal views of the appropriate interaction of parties in business transactions. It is thus not a simple matter to append financial infrastructure to an economy developed without it. Accordingly, full convergence across countries of domestic financial infrastructure or even of the international components of financial infrastructure is a very difficult task. Nonetheless, the competitive pressures toward convergence will be a formidable force in the future if, as I suspect, additional forms of financial intermediation will be increasingly seen as benefiting an economy that develops capital markets. Moreover, a broader financial infrastructure will also likely be seen as strengthening the environment for the banking system and enhancing its performance. The result almost surely will be a more robust and more efficient process of capital allocation, as a recent study by Ross Levine and Sara Zervos suggests (“Stock Markets, Banks, and Economic Growth,” American Economic Review, vol. 88 (June 1998), pp. 537-558). Its analysis reinforces the conclusion that financial market development improves economic performance, over and above the benefits offered by banking sector development alone. The results are consistent with the idea that financial markets and banks provide useful, but different, bundles of financial services. It is no coincidence that the lack of adequate accounting practices, bankruptcy provisions, and corporate governance has been mentioned as an element in several of the recent crises that so disrupted some emerging-market countries. Had these elements been present, along with the capital markets they would have supported, the consequences of the initial shocks of early 1997 may well have been quite different.
It is noteworthy that the financial systems of most continental European countries escaped much of the turmoil of the past two years. And looking back over recent decades, we find fewer examples in continental Europe of banking crises sparked by real estate booms and busts or episodes of credit crunch of the sort I have mentioned in the United States and Japan. Until recently, the financial sectors of continental Europe were dominated by universal banks, and capital markets are still less well developed there than in the United States or the United Kingdom. The experiences of these universal banking systems may suggest that it is possible for some bank-based systems, when adequately supervised and grounded in a strong legal and regulatory framework, to function robustly. But these banking systems have also had substantial participation of publicly owned banks. These institutions rarely exhibit the dynamism and innovation that many private banks have employed for their, and their economies’, prosperity. Government participation often distorts the allocation of capital to its most productive uses and undermines the reliability of price signals. But at times when market adjustment processes might have proved inadequate to prevent a banking crisis, such a government presence in the banking system can provide implicit guarantees of resources to keep credit flowing, even if its direction is suboptimal. In Germany, for example, publicly controlled banking groups account for nearly 40% of the assets of all banks taken together. Elsewhere in Europe, the numbers are smaller but still sizable. In short, there is some evidence to suggest that insurance against destabilizing credit crises has been purchased with a less efficient utilization of capital. It is perhaps noteworthy that this realization has helped engender a downsizing of public ownership of commercial banks in Europe, coupled with rapid development of heretofore modest capital markets – changes which appear to be moving continental Europe’s financial system closer to the structure evident in Britain and the United States. Diverse capital markets, aside from acting as backup to the credit process in times of stress, compete with a banking system to lower financing costs for all borrowers in more normal circumstances. Over the decades, capital markets and banking systems have interacted to create, develop, and promote new instruments that improved the efficiency of capital creation and risk bearing in our economies. Products for the most part have arisen within the banking system, where they evolved from being specialized instruments for one borrower to having more standardized characteristics. At the point that standardization became sufficient, the product migrated to open capital markets, where trading expanded to a wider class of borrowers, tapping the savings of larger groups. Money market mutual funds, futures contracts, junk bonds, and asset-backed securities are all examples of this process at work. Once capital markets and traded instruments came into existence, they offered banks new options for hedging their idiosyncratic risks and shifted their business from holding to originating loans. Bank trading, in turn, helped these markets to grow. The technology-driven innovations of recent years have facilitated the expansion of this process to a global scale.
Positions taken by international investors within one country are now being hedged in the capital markets of another: so-called proxy hedging. But developments of the past two years have provided abundant evidence that where a domestic financial system is not sufficiently robust, the consequences for a real economy of participating in this new, complex global system can be most unwelcome. Improving deficiencies in domestic banking systems in emerging markets will help to limit the toll of the next financial disturbance on their real economies. But if, as I presume, diversity within the financial sector provides insurance against a financial problem turning into economy-wide distress, then steps to foster the development of capital markets in those economies should have a special urgency. And the difficult groundwork for building the necessary financial infrastructure – improved accounting standards, bankruptcy procedures, legal frameworks, and disclosure – will pay dividends of its own. The rapidly developing international financial system has clearly intensified competitive forces that have enhanced standards of living throughout most of the world. It is important that we develop domestic financial structures that facilitate and protect our international financial and trading systems that, aside from their periodic setbacks, have brought so much good.
board of governors of the federal reserve system | 1999 | 9
Remarks by Mr Laurence H Meyer, Member of the Board of Governors of the US Federal Reserve System, before the Institute of International Bankers Annual Breakfast Dialogue, held in Washington, D.C. on 27 September 1999.
Mr Meyer highlights several issues on the Fed’s agenda for bank supervision and regulation

Remarks by Mr Laurence H Meyer, Member of the Board of Governors of the US Federal Reserve System, before the Institute of International Bankers Annual Breakfast Dialogue, held in Washington, D.C. on 27 September 1999.

* * *

I am delighted to have this opportunity to join with the other distinguished members of this panel before the Institute of International Bankers. My goal this morning is to highlight several issues on the Fed’s agenda for bank supervision and regulation. The first item never seems to leave the agenda: financial modernization. The issue has two sides: the market process and the legal framework. In the market-driven process, financial institutions are increasingly competing with each other – with banks seeking to expand the financial services they offer within banking organizations and, at the same time, non-bank financial institutions offering many bank-like products. This process has been under way for years, though constrained by the prevailing statutory limits. Legislation would allow the market process to evolve further and would, in addition, refine the supervisory and regulatory framework for the diversified financial services firms that would emerge as a result of the legislation. Many of you know the details of the disagreement between Treasury and the Federal Reserve over what that supervisory framework and permissible structure should look like. Rather than restating those details, let me simply express the hope that the efforts of the regulators and Congress will ultimately be successful – in this millennium – in moving the bill to passage. Being optimistic by nature, I turn to the challenges for both the banking system and the supervisors in adapting to the changes in banking and financial services that the passage of legislation would set in motion. Developing cooperation and coordination among the multiple supervisors of financial services firms would be one of the first. Financial modernization envisions a blend of functional and umbrella supervision. Depository institutions would continue to be supervised by their current bank or thrift regulators, with functional regulation of the new non-bank activities by their specialized regulators and umbrella supervision of the diversified financial services holding company by the Federal Reserve. This approach has a lot to recommend it but requires a high degree of cooperation and coordination between and among the bank supervisors and the functional regulators, specifically securities and insurance. This cooperation is essential to limit the regulatory burden otherwise associated with multiple regulators. Bank supervisors need to be well informed about the risks to the banking organization – and the depository institution in particular – from activities taking place in affiliates and, perhaps also in some cases, in operating subsidiaries. The umbrella supervisor will have to keep other regulators informed so they can do their job as well. We have had some experience as bank regulators in managing these kinds of communications. The Federal Reserve has worked hard with state banking authorities and with the OCC to increase the coordination of our examination and supervisory efforts, and we will continue to look for ways to improve this coordination as we also move to improve communication and coordination with functional regulators of the non-bank activities.
The proposed blend of functional and umbrella supervision is one of several models for the supervision of diversified financial services firms that include a bank. The United Kingdom, Australia, Switzerland, and some other countries have moved to a very different structure that relies on a single consolidated financial services regulator and eliminates supervisory and regulatory responsibilities at the central bank. In this approach, the single regulator supervises bank and non-bank activities alike. Some see this structure as the wave of the future, given the blurring of distinctions between financial services. I wish our colleagues abroad the best of luck, but I think their legislatures have made a mistake. The argument is that in a market that has increasingly eroded differences among financial institutions, the uniqueness of banks has declined, and the combining of financial institution regulators and adoption of similar, if not identical, regulations makes sense. Adding a powerful single regulator to a powerful and independent central bank would create an entity with significant authority outside the day-to-day direct purview of government, so governments have opted to combine the regulators and strip the supervisory and regulatory power from the central bank. But please note that they have continued to make their central banks responsible for financial stability. While macro financial tools and monetary policy may be sufficient to do that job most of the time, supervisory and regulatory policies have important economic and stability implications. Particularly in a crisis, a central bank without knowledge of the way markets actually operate – knowledge that can be gained only by experience and hands-on contact with banking organizations – will be, if you will excuse an ex-professor’s metaphor, at risk of failing its final exam. As a result, I think the separation of central banking and supervision and regulation is dangerous. The supervisory and regulatory approach in the United States, as embodied in financial modernization legislation that the Federal Reserve supports, is quite different from the trend abroad. It provides an important role for the central bank in the process as an umbrella supervisor. It also makes an important distinction between the insured depository institution in the banking organization and its non-bank affiliates (or perhaps in some cases, its operating subsidiaries). Specifically, this approach envisions a less intense degree of supervision and regulation of the non-bank activities than for the depository institution itself. This reflects the role of the safety net in undermining market discipline of depository institutions and the role that supervisors must play to discipline risk-taking by them as a result. On the other hand, market discipline appears to operate more effectively with respect to non-bank financial institutions and, if the regulators do their job right, to non-bank activities within banking organizations. As non-bank activities grow within banking organizations, regulators must be alert to market inferences that non-bank activities within a banking organization are covered by an expanded safety net. Regulators must establish appropriate expectations with respect to the limits of the safety net, and confirm these expectations by their actions, to maximize market discipline of the non-bank activities and minimize the level of intrusion.
The alternative, to impose bank-like regulation on non-bank activities, is exactly the wrong direction. We would rather work towards enhancing the effectiveness of market discipline on banks. I will return to the task of enhancing market discipline in a moment. While President McDonough will discuss the ongoing work of the Basel Committee on Banking Supervision to reform the Basel Accord, I want to emphasize its importance on our agenda. Most observers think, quite correctly, of the Basel Accord exclusively in terms of minimum capital standards. But the Basel consultative paper is consistent with the Federal Reserve’s long-standing emphasis that minimum capital standards are just a part of the framework for enhancing the safety and soundness of the banking system. Indeed, the paper encourages a rebalancing of emphasis towards supervision and market discipline. With respect to minimum regulatory capital standards, the most important task is to make the capital requirements for the banking book more risk-sensitive. Banks are increasingly estimating the risk of their individual loans and allocating their economic capital accordingly. When internal capital allocations result in decidedly lower capital than required under the Accord, the choice is either to stop making such loans or to find ways to lower the capital charges applied to them, taking advantage of a variety of transactions, including securitizations and credit derivatives. Their capital arbitrage activities have shown banks to be much more adept at shedding capital requirements than at reducing risks. The solution is to reform the capital standards so they are more consistent with the underlying risks in the banking book and to reduce – if not eliminate – the incentive for and the ability to conduct regulatory capital arbitrage. That is the principle. The question is how to achieve this end. We look forward to the comments from the banking industry on proposals to use external ratings or banks’ own internal risk-rating systems to produce a more risk-sensitive assessment of minimum capital requirements for the banking book. As useful as a more risk-sensitive minimum capital standard would be, the fact is that most banks hold capital well in excess of the regulatory minimum. Therefore, one of the most important directions for bank supervision – at least with respect to the largest and most complex banks – is to place more emphasis on the supervisory assessment of the appropriate economic capital for a bank. The Basel consultative paper points in this direction, and the Federal Reserve is already moving to implement such an approach. Recent supervisory guidance stressed that banks should set a target for capital that is appropriately aligned with the risk profile and risk-management capability of the bank. Bank supervisors will assess the adequacy of this target level of capital in their examinations of banks, and banks should defend their target level of capital to the market. This approach will give further impetus to the advances already well under way in the banking industry to refine the measurement of risk and the allocation of capital using internal risk ratings and internal models. This will hopefully yield an important synergy between banks and their supervisors: the better the risk measurement and management of banks, the better the opportunity of bank regulators and supervisors to lever off these practices in the setting of regulatory capital standards.
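To make the arbitrage incentive concrete, here is a minimal sketch in Python. It assumes the flat treatment of the 1988 Accord, an 8% minimum charge against a 100% risk weight for corporate credit, while the internal economic-capital rates by credit grade are purely hypothetical illustrations, not figures from the consultative paper or from any bank’s model.

# Illustrative sketch of the regulatory capital arbitrage incentive described
# above. The 8% minimum and the flat 100% corporate risk weight reflect the
# 1988 Basel Accord; the internal economic-capital rates are hypothetical.

MIN_RATIO = 0.08              # minimum capital as a share of risk-weighted assets

def regulatory_capital(exposure, risk_weight=1.0):
    """Minimum capital under a flat risk-weight rule."""
    return exposure * risk_weight * MIN_RATIO

loan = 100_000_000            # a $100 million corporate loan

# Hypothetical internal (economic) capital rates by borrower grade.
economic_capital_rate = {"AAA": 0.01, "BB": 0.08, "B": 0.15}

for grade, rate in economic_capital_rate.items():
    reg = regulatory_capital(loan)    # same charge regardless of grade
    econ = loan * rate                # what the bank's own model allocates
    gap = reg - econ                  # positive gap: incentive to securitize
    print(f"{grade}: regulatory ${reg/1e6:.1f}m, economic ${econ/1e6:.1f}m, "
          f"gap ${gap/1e6:+.1f}m")

The positive gap on the high-grade loan is the arbitrage: a bank can securitize that exposure, shed the 8% charge, and retain the risks its own model judges cheap to hold, which is precisely why a flat standard drifts away from underlying risk.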
Another part of the rebalancing is towards increased emphasis on market discipline. I have noted that the safety net dampens the incentive of the market to assess risks in banks. The solution is not to ignore the potential for market discipline, but rather to find ways to enhance its role in banking. At the Federal Reserve, we are beginning to pay closer attention to market measures of risk, including measures derived from uninsured deposit rates, equity prices, and subordinated debt yields. These measures may prove useful in alerting supervisors to changes in the market perception of risk (a stylized sketch of such a monitor appears below). I have on previous occasions discussed the potential of a mandatory subordinated debt requirement for increasing the incentive of market participants – in this case subordinated debt holders – to monitor the risk-taking of banks. While we have no plans to move forward with such a requirement at this time, I continue to find this proposal intriguing, and we will continue to study its potential usefulness. Market discipline is reinforced by enhanced public disclosure of risk positions and risk-management capabilities. We have recently assessed the adequacy of disclosure and believe there are opportunities for banks – especially the large, complex banking organizations with a large share of their assets funded by uninsured liabilities – to reveal more information. One possibility would be to have an industry-led task force identify best practices with respect to bank disclosure and provide banks with an incentive to move towards this best-practice frontier. Any such effort should be flexible, not necessarily dictating the same disclosures for all banks but allowing banks to choose the most effective ways to disclose – through both qualitative and quantitative information. In our supervisory practices and regulatory standards we draw a distinction between the largest and most complex banking organizations (what we refer to as LCBOs) and the overwhelming majority of small and medium-sized banks. For example, while I applaud the direction in the Basel consultative paper towards a more sophisticated and more risk-sensitive minimum capital standard, I do not believe this more complex system is warranted or appropriate for the overwhelming majority of US banks. We are therefore thinking about what the banking agencies are calling bifurcation: using a simpler capital standard for the thousands of smaller and medium-sized banks as we adjust our capital requirements for the large complex organizations to better fit their risk profiles. We already have different intensities of supervisory oversight at large complex banks and at small and medium-sized banks, and we have singled out a very small number of the largest and most complex banks for a program that features an enhanced focus on risk management and internal controls, the use of internal credit-rating systems, and internal analyses of capital in relation to risk. This trend will continue. Let me close with a comment about a possible relaxation in credit discipline that our supervisory reviews have detected at some banks. Now, don’t mistake me; loan portfolios remain sound overall. But loans falling into criticized categories have been rising modestly at some banks over the past several quarters. That’s troubling because the increase has surfaced despite the continuation of favorable economic and financial conditions in the United States. It appears the vulnerability of these loans was heightened in some cases by weak underwriting practices.
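As promised above, a rough illustration of how such market-based measures might be monitored: the sketch computes subordinated debt spreads over a Treasury benchmark and flags unusual widening. The bank names, yields, and alert threshold are all hypothetical, and this is not a description of any actual Federal Reserve surveillance system.

# Hypothetical sketch: subordinated debt yields as a market signal of bank
# risk. All figures and the alert threshold are illustrative only.

treasury_yield = 0.058                # benchmark risk-free yield

sub_debt_yields = {                   # hypothetical bank sub-debt yields
    "Bank A": 0.066,
    "Bank B": 0.071,
    "Bank C": 0.085,
}

ALERT_BPS = 200                       # flag spreads wider than 200 basis points

for bank, y in sub_debt_yields.items():
    spread_bps = (y - treasury_yield) * 10_000
    flag = "  <-- review" if spread_bps > ALERT_BPS else ""
    print(f"{bank}: spread {spread_bps:.0f} bp{flag}")

A widening spread tells the supervisor only that debt holders have grown warier; the follow-up examination, not the number itself, establishes why.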
In the cases of weak underwriting just described, a recurring theme has emerged. Lenders are relying too much on the continuation of good times. They’re taking a very optimistic view of their borrowers’ operating prospects and assuming that their borrowers will always have ready access to financial markets. And sometimes they’re failing to subject loans to meaningful “stress tests” that would, for instance, tell them whether their borrower could withstand an unexpected shock to operating revenue. These are the kinds of developments that tend to get the attention of bank supervisors, and they ought to get the attention of banks and other lenders as well. I have raised a number of questions and perhaps answered a few. Thank you for the opportunity to do so.
board of governors of the federal reserve system | 1999 | 9
Remarks by Mr Roger W Ferguson Jr, Member of the Board of Governors of the US Federal Reserve System and Chairman of the Joint Year 2000 Council, before the Wallenberg Forum at Georgetown University, Washington, D.C., on 28 September 1999.
Mr Ferguson assesses financial sector Y2K readiness in the United States and abroad with less than 100 days to go

Remarks by Mr Roger W Ferguson Jr, Member of the Board of Governors of the US Federal Reserve System and Chairman of the Joint Year 2000 Council, before the Wallenberg Forum at Georgetown University, Washington, D.C., on 28 September 1999.

* * *

Year 2000: ninety-four days and counting

My topic today is a timely one – the Year 2000. My comments will focus on the readiness of the financial services industry in the United States and the significant work and coordination that have occurred globally to prepare for the century change. The century date change poses considerable risk and is a major challenge for all sectors, both public and private. Although US financial market authorities established specific and aggressive deadlines for their industries to prepare for the century date change, in the final analysis, the boards of directors and senior management of firms are responsible for ensuring that their organizations are prepared for the Year 2000. We turn to them, ultimately, to ensure that their firms are able to conduct business and provide uninterrupted services after the century date change.

Increased confidence in readiness of US financial industry and other sectors

With only ninety-four days remaining before the century rollover, I believe that the US financial sector is ready. My assessment is based on publicly available facts and information developed by supervisors and industry participants. More than 99% of the approximately 22,000 federally insured depository institutions and credit unions have demonstrated to examining supervisors that they have completed preparations for the Year 2000, tested their mission-critical systems and put them into production. Securities and futures brokers, dealers, and markets have completed or are completing Y2K-readiness certifications, and both industries have participated in large-scale “street” tests during which literally hundreds of thousands of forward-dated transactions were processed on exchanges and then cleared and settled using automated systems that simulate the century rollover at their depositories and clearing houses. These tests revealed Y2K errors at a level of something less than one-tenth of 1%, and all were quickly corrected. In fact, the testing tools now being used are so thorough that in many cases non-Y2K program bugs have been identified in existing programs and systems, and I am happy to report that they have been eradicated as well. Retail and wholesale payment systems – a vital part of the financial sector and one in which the Federal Reserve is a major service provider – are engaged in a similar process. I can assure you that the Federal Reserve has not spared any effort in preparing for the Year 2000, both in our internal systems and in the financial services and products we provide to financial institutions. We have completed Y2K preparations for our services and products, and in June 1998 we opened a testing facility for our customers. To date, more than 9,000 financial institutions have tested the services they use with the Federal Reserve. These represent all of our major customers in terms of transaction volume and dollar amount of the items processed through the Federal Reserve. We also have tested the automated payment services we provide to federal agencies such as the Social Security Administration to ensure that banks can receive government payments and then post the deposits to their customers’ accounts.
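For readers who have not lived with the underlying bug: many legacy systems stored years as two digits, so “00” sorted before “99” at the rollover, and the forward-dated test transactions described above exercise exactly this kind of logic. The Python sketch below illustrates one common remediation, “windowing”, which interprets two-digit years relative to a pivot; the pivot value of 50 is a conventional illustration, and any given system’s actual choice was its own.

# Minimal illustration of the two-digit-year problem and the "windowing"
# remediation many legacy systems adopted. The pivot of 50 is illustrative.

PIVOT = 50   # two-digit years below 50 map to 20xx, the rest to 19xx

def expand_year(yy):
    """Interpret a two-digit year relative to the pivot."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

# A naive comparison on raw two-digit years gets the rollover backwards:
assert not (0 > 99)                        # "00" looks earlier than "99"
# Windowed years compare correctly across the century boundary:
assert expand_year(0) > expand_year(99)    # 2000 > 1999

print(expand_year(99), expand_year(0), expand_year(27))   # 1999 2000 2027

Remediated logic of this sort then had to be proven with forward-dated test data, which is what the street tests processed by the hundreds of thousands.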
The New York Clearing House, in particular, and other private commercial entities that process wholesale and retail payments have followed testing programs similar to ours. No one can say with certainty that there won’t be any problems or disruptions during the century rollover. However, based upon the information I have shared with you, we expect that any disruptions or glitches in the United States that do occur will be minor and of limited duration. Moreover, because there is an expectation that something somewhere will go wrong, the financial sector – from the regulators to the markets and payment systems to the smallest introducing broker or bank branch – is preparing contingency plans. I would also like to emphasize that a percentage of automated systems are down every day without causing serious disruptions to commercial transactions and markets. For example, 1–2% of ATMs in the United States are down at any given moment – some simply because they are out of paper – yet consumers know to go down the block to another machine or into the bank branch or local supermarket to obtain the cash they need. Even more serious disruptions periodically occur: computers at the New York Stock Exchange and, only last month, the Chicago Board of Trade have experienced glitches that caused their markets to close temporarily without causing serious disruption to the US financial markets. In all of these situations, Americans react with typical aplomb: they prioritize and address the most serious safety and well-being issues first, and they usually are willing to tolerate some inconveniences and delays related to less-critical needs.

Increase in readiness information and readiness of other domestic sectors

As I said earlier, my assessment of the financial services sector is based on publicly available facts and information. Even as late as this spring, Y2K information about firms and industries was largely sketchy or incomplete throughout the world. In many countries, few firms or sectors were willing to provide information about the Year 2000 process or their status relative to national and international benchmarks. Failure to disclose caused considerable concern in markets. Market participants began to assume that firms and sectors that were not making some type of self-disclosure probably were seriously behind in their preparations for the Year 2000. This assumption engendered the potential for overreaction by market participants and consumers, including withdrawal from markets and commercial relationships and other rational and irrational risk-mitigation techniques. I believe that the potential for overreaction has been greatly reduced in recent months because of a dramatic improvement in disclosure and confirmation of readiness through multiparty testing. First, in the United States, the President’s Council on Year 2000 Conversion, led by John Koskinen, forged unique and successful cooperative partnerships between critical public agencies and related private sectors and made those sectors accountable to the American public through the quarterly release of sector assessment reports. The President’s Council, established in February 1998, is made up of more than 30 major federal agencies that act as sector coordinators in promoting Year 2000 public and private sector action within their respective policy areas.
Quite simply, the President’s Council has spurred the government and whole industries to coordinate Y2K preparations, set benchmark dates for readiness, and organize and report the results of inter- and cross-industry tests. The electric power and telecommunications industries, which are critical to the operation of the financial services industry, are excellent examples of this achievement. Neither industry is supervised in the way that the financial services industry is, yet their umbrella agencies – the Department of Energy and the Federal Communications Commission – were able to energize industry-led groups such as the North American Electric Reliability Council (NERC) and the Network Reliability and Interoperability Council (NRIC). The results of their efforts can be read in the assessment report released by the President’s Council last month. As of June 1999, electric power distribution companies serving 96% of the nation’s electricity needs were ready for the Year 2000. Similarly, as of July 1999, long-distance telecommunications carriers controlling 92% of domestic calls were 99% Y2K compliant. Second, regulators and private-sector firms have been exerting pressure on market participants and commercial firms to provide information about Y2K readiness. Banks and broker-dealers have been required by their regulators to communicate with customers about their Y2K programs and readiness. Moreover, financial institutions, which have a duty to assess and manage the Year 2000 risk, have been seeking Y2K disclosure from major customers and counterparties. If these entities are publicly traded companies, they are required by the SEC to address Y2K readiness in their quarterly filings. Private-sector groups, such as the Global Year 2000 Co-ordinating Group, made up of global banking and financial services firms, have been very active in prodding market participants to publish information about their own readiness. The Global 2000 Group has also been instrumental in encouraging countries to produce information about the readiness of key sectors within their borders. Third, we’ve seen increasing efforts by the media to provide factual and balanced reports on the Year 2000. I think you’ll agree that the press is now reporting the good news as well as the “what if” pieces. The Federal Reserve and our sister agencies plan to engage the media in an ongoing conversation about the financial services industry through the rollover period. We expect intensive media coverage during the rollover period, and the President’s Council will be running a national Information Coordination Center, supplemented by reports from agencies and industry members, that will provide the public with accurate, timely, and complete information about the operation of critical sectors in the United States. These disclosure initiatives, assisted by legislation that limits liability for Year 2000 disclosure statements made in good faith, literally have opened the gates of information and offer a powerful antidote to any Y2K gloom-and-doom stories generated by the media.

Year 2000 market indicators and Federal Reserve monetary preparations

Over the next few months and through the first part of the new year, the Federal Reserve, like other central banks, may be facing some unusual, but not unmanageable, challenges in carrying out its responsibility to meet market demands for currency, reserves, and liquidity more generally. We already have seen signs of heightened demands for liquidity and safety in the United States.
The Federal Reserve expects banks and other financial intermediaries to have reasonable plans in place to manage cash and liquidity and provide for contingencies over the century date change. However, we also recognize our responsibility to assure that adequate overall levels of liquidity are available and to provide a backstop to the financial system. The Federal Reserve has a number of tools available to effectuate monetary policy and to satisfy market liquidity needs. For example, we use our open market operations to provide liquidity by entering the market to buy or sell government and agency securities. Recently, we created several new tools to help fine-tune our open market operations and reassure market participants that adequate liquidity will be available when needed. First, we lengthened the maximum term of our repurchase agreements to ninety days, from sixty; this is a permanent change in our operations. Second, we are willing to accept a broader range of collateral in repurchase transactions, such as pass-through mortgage securities of government-sponsored enterprises. Third, we will be selling options on overnight repurchase agreement transactions for exercise on specific days in December 1999 and January 2000; the details on this are still being worked out. These latter programs have been authorized only for the Y2K period. We have already seen butterfly spreads and other measures of Y2K pressures in US markets respond positively to the announcement of these tools. Even with the flexibility provided by these tools, if the markets become more volatile, it may be difficult to forecast aggregate reserve demand and supply, engendering the potential for an unexpected shortfall in reserves. Moreover, it is possible that the distribution of liquidity will become uneven – some banks may receive increased deposits and be flush with funds while others may experience unexpected shortfalls. And many banks could experience unusual loan demands related to the Y2K needs of their customers. Broadly speaking, uncertainties about Y2K have given rise to a general reluctance among lenders to extend unsecured credit over the year-end. At the same time, borrowers are trying to lock in funding now for the year-end rather than face the possibility of high interest rates or market disruptions. To help meet unusual funding and liquidity needs during the period around the century date change, the Federal Reserve has created a special liquidity facility (SLF) as an adjunct to its discount window programs. The special liquidity facility will be open from 1 October 1999 through 7 April 2000. It will be available to depository institutions operating in the United States and in sound financial condition. Loans must be adequately collateralized and will be made at a penalty rate of 150 basis points above the FOMC’s targeted federal funds rate. In contrast to subsidized adjustment credit, which will still be available, borrowers under the SLF will not be required to first seek credit from market sources, and the usage of borrowed funds will not be limited or monitored. Moreover, loans can be outstanding for any period while the facility is open. The Federal Reserve’s special liquidity facility is similar to the so-called “Lombard” credit facilities offered by a number of European central banks. We do not intend for any supervisory or market stigma to be attached to use of the facility; if one develops, the potential for this facility to ease Y2K liquidity needs may not be fully realized.
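The facility’s pricing, at least, is mechanical, and a minimal sketch makes the terms concrete; the target rates below are hypothetical examples, not announced FOMC targets.

# The special liquidity facility lends at a fixed 150 basis point penalty
# over the FOMC's target federal funds rate. Target values are illustrative.

PENALTY_BPS = 150

def slf_rate(target_funds_rate):
    """SLF lending rate: the target rate plus the 150 bp penalty."""
    return target_funds_rate + PENALTY_BPS / 10_000

for target in (0.0500, 0.0525, 0.0550):    # hypothetical FOMC targets
    print(f"target {target:.2%} -> SLF rate {slf_rate(target):.2%}")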
If banks are willing to utilize the facility, it should help to maintain orderly markets and to cap the federal funds rate in those markets. One final point about liquidity – as I said above, we expect payment mechanisms to function smoothly. However, it is possible that currency demands will increase over the next few months. The Federal Reserve is prepared to meet any currency demands that may arise, and we are taking a number of steps to ensure that cash is stored at numerous sites around the country to allow banks to meet any sudden or unexpected spikes in the currency needs of their customers.

International efforts and readiness

On the international front, we have seen tremendous progress in the awareness of the Year 2000 issue. Most countries now have Y2K national coordinators and are providing more information about their efforts. Again, the financial services sector abroad leads all others in preparedness. The international telecommunications industry also appears ready, and service should be reliable between major cities. This progress again can be traced to a number of public and private-sector initiatives.

United Nations Cooperation Center

Earlier this year, the United Nations Committee on Informatics held two highly successful meetings for national Y2K coordinators. The meetings were designed to promote awareness and provide tools for coordinators to use to organize domestic Year 2000 programs. At the second meeting, held in June, more than 170 countries were represented – more countries than have ever attended a special UN meeting. The UN is encouraging countries to issue disclosures about their Year 2000 programs and progress and has established a Year 2000 Cooperation Center, which publishes country readiness reports disclosed by countries on a website. The Center will also publish reports on the impact of the date change on critical sectors within countries during the rollover period. The website provides information on the status of key sectors within a country, including the public infrastructure.

Joint Year 2000 Council

In April 1998, a group of international financial market authorities – the Joint Year 2000 Council – was established by the Basel Committee on Banking Supervision, the Committee on Payment and Settlement Systems, the International Association of Insurance Supervisors, and the International Organization of Securities Commissions. The council has provided the key forum for Y2K communications among financial market authorities around the globe. To date, more than 100 countries have participated in our activities. I chair the Joint Council, and I am very proud of what we have achieved over the past year. The Joint Council has issued a number of guidance papers to assist regulators in organizing the Year 2000 efforts of the financial services industries within their countries. These papers address the scope and impact of the Year 2000 challenge, the independent assessment of the preparedness of financial institutions, testing, information sharing, and contingency planning. In addition, the Joint Council issues bulletins summarizing recent developments and best practices for supervisors. Most important, the Joint Council is completing a second round of regional meetings that focus on contingency planning, event management, and public communications strategies.
These meetings provide an excellent opportunity for supervisors to discuss common interests within specific geographic areas, to share information, and to coordinate regional plans in anticipation of the rollover. The Joint Council also serves as a point of contact for various national and international private-sector initiatives. In this regard, it has established an External Consultative Committee to enhance information-sharing between the public and private sectors. The ECC includes representatives of internationally oriented organizations, including the International Monetary Fund, the World Bank, and the major cross-border financial utilities for making international payments and settling transactions, such as S.W.I.F.T. The council is now in the process of establishing information-sharing facilities for financial market authorities to use during the rollover period. Private sector activity In the international financial community, we have a highly active private sector that has been effective in promoting the readiness of financial services firms and markets around the globe. As I mentioned earlier, private sector groups, such as the Global 2000 Coordinating Group, have been educating private sector executives and public sector officials about the international interest in the readiness of key sectors in their countries that are critical to international commerce, such as telecommunications, power, financial services, shipping, and transportation. No one can declare with certainty how the millennium rollover will unfold internationally, and much of my information is anecdotal. However, the financial services sector is generally perceived to be better prepared than other sectors in almost every country. In general, I can report that the financial firms of the developed countries, like those in the United States, either are, or appear to be making good progress toward being, prepared. Similarly, the financial institutions of a number of transitional economies are well advanced. The financial institutions that are thought to have the furthest to go, in general, are those in countries that are least dependent on technology. These institutions, however, have the greatest experience with frequent disruptions of the type that one might expect during the changeover period and can most easily return to manual workarounds or other contingency plans. During 1999, the financial community organized a series of very successful cross-border payment systems tests. Thirty-four separate national and international payments systems in nineteen countries participated. More than 500 financial institutions successfully completed simulated Year 2000 transactions on systems that were forward-dated to simulate the rollover. For these tests to be successful, the participants had to have completed all necessary Year 2000 preparations to their internal systems. It therefore seems highly unlikely that the payment systems will be the source of instability during the century date change. Conclusion Although much work has been done within the United States and around the globe in anticipation of the century date change, we should not be complacent. There is still work to be done in terms of contingency planning and public communication. The Federal Reserve will continue its ongoing monitoring of progress. We also intend to stay in close contact with the markets and financial institutions through the date change.
While we cannot know with certainty what the century rollover will bring, we should, based on what we know today, experience a smooth transition, perhaps even business as usual.
Mr Meyer reports on lessons from recent global financial crises Remarks by Mr Laurence H Meyer, Member of the Board of Governors of the US Federal Reserve System, before the International Finance Conference, Federal Reserve Bank of Chicago, held on 1 October 1999. * * * On the principle that mistakes teach us as much as, if not more than, successes, I will take this opportunity to consider what can be learned from the financial crises that afflicted so many economies over the past 2½ years to improve bank supervision and regulation. I will consider two episodes – the Asian financial crisis and the financial market turmoil surrounding the near-bankruptcy of the hedge fund Long-Term Capital Management (LTCM). Because these episodes are too recent and too complicated to draw many firm conclusions that lead to concrete policy recommendations, part of my objective today is to identify some areas where I believe that further study would be particularly productive. The two crises The details of these two episodes are familiar to everyone here. The floating of the Thai baht in July 1997 marked the onset of a period of market turbulence associated with the halting of new funds to, and in most cases the flight of existing money from, many economies in Southeast Asia. Because many entities in those nations relied on short-term funding in foreign currency for their ongoing operations, the drying up of funding from global financial markets quickly placed serious strains on them. And because the lines between the private and official sectors were often drawn imprecisely, these funding problems for firms soon became the burdens of national governments. Such pressures, unfortunately, exposed numerous and substantial flaws in some of these financial systems, including lax lending policies, substantial mismatches in the maturities and foreign exchange denominations of assets and liabilities, seriously deficient standards for disclosure of basic private financial information, hedges based on erroneous presumptions about the correlations among returns, the failure to monitor ongoing loan performance, and the inadequacy of reserves against potential losses. In autumn 1998, we in the major industrial countries were reminded that emerging market economies did not have a monopoly on financial institutions that were inadequate to the task of assessing risks. The effective default of the government of Russia on some of its obligations and the travails of LTCM in rebuilding its capital in the face of enormous trading losses – occurring as they did so soon after the crisis in Asia – triggered a generalized reassessment of risk-taking by investors and market makers. Interest rate risk spreads widened into ranges normally associated with the troughs of recessions, and reductions in the liquidity of even government securities markets suggested a marked contraction of trading activity. In retrospect, the counterparties of LTCM had underestimated the risks associated with that firm’s strategy, both in terms of the leverage undertaken and the size and scope of open positions, some of which were in illiquid markets or ones that would become illiquid if attempts were made to close out positions in scale.
Many firms dealing with LTCM quickly found themselves with a considerably greater potential exposure to loss than they had bargained for, both directly on their credits to LTCM and – should the hedge fund have failed and prices moved as its positions were liquidated – indirectly in their own trading books and on some of their other outstanding credits. Some tentative lessons and responses I take three lessons away from this experience and suggest a like number of supervisory and regulatory responses. As for the lessons, first, measures of direct risk-taking can provide misleading assessments of overall exposure in an environment with so many interconnections. The direct lending exposure of most global financial institutions to Thailand, Malaysia, and Korea was limited at the time. However, proxy hedging of country risks in international financial markets propagated shocks across national borders beyond that called for by direct trade and financial linkages alone, spreading the initial problems in Asia to many other markets. Put simply, those entities with direct exposure to troubled credits sometimes took offsetting hedging positions in obligations of other economies that traded in deeper and more liquid markets, on the theory that the usual correlation among returns in that asset class would trim at least some of their potential losses. In the process, those deeper markets used for hedging purposes, including ones in Hong Kong, Australia, Brazil, and Mexico, suffered their own downdrafts at the peak of the Asian crisis. Similarly, concerns about the potential fire sale of LTCM’s assets, as well as the efforts of counterparties to rebalance their positions in advance of a possible failure of the hedge fund, produced wide swings in many financial prices and contributed to a drying up of market liquidity. As a result, institutions with no direct exposure to Asian economies, Russia, or LTCM found themselves caught flat-footed. Second, regulators and industry participants in industrial countries have reason to be proud that improvements in capital, regulatory structure, and risk management over the years allowed depository institutions to weather the storms without substantial adverse effects. Exposures to emerging market economies were much more limited in this episode than in the debt crisis of the early 1980s. For instance, on the eve of the debt crisis in 1982, US banks’ direct exposure to Latin America ran about 125% of their capital, and for the largest banks, it was more than 180% of capital. In contrast, in mid-1997, US banks’ direct credit exposure to all emerging markets was around one-third of their capital. These lower exposures reflect, to an important degree, better management of risk that recognizes the importance of diversification across asset classes. Third, regulators and industry participants also have reason to be humble. Few would have predicted that the floating of the Thai baht would topple dominoes all over the region with such force. And the almost universally accepted opinion of the risk-taking prowess of LTCM proved mistaken in retrospect. These examples should serve as a reminder that we will not be able to know where, when, or with how much force the next crisis will hit. However, one of the surest lessons of history is that there will be a next crisis, a crisis that will share some attributes of the ones that came before while offering new challenges in its own right.
And that brings us to the appropriate responses: first, in preparation for the next round of problems, supervisors and regulators should reinforce efforts to get the basics right. For all the talk of financial wizardry that allows the unbundling and transferring of risks, and the lightning speed of transmission that occupies so much attention, I would like to remind everyone that, by and large, the mistakes of the past few years were rather humdrum. In Asia, there were widespread failures of supervision and regulation, including the failure to enforce limits on lending to individual entities, to appreciate the implications of over-reliance on potentially changeable short-term sources of funding, to evaluate repayment risk on a timely basis, and to react quickly as problems emerged. In industrial countries, it was a widespread misassessment of counterparty risk. Whether lulled by the collateral provided for credit exposure by each daily marking to market or by the lofty reputation of the principals of the firm, counterparties failed to provide an effective check on the leverage of LTCM. That said, we should appreciate that in the United States and Europe, bank supervision and regulatory capital standards worked well in protecting the banking system. Thus, a second item on my list of responses to the recent financial crisis is that work must continue to determine the incremental improvements that can be put in place within the existing structure, including, especially, supplementing those efforts with an increased reliance on market discipline. And a third important response is to recognize that these incremental improvements will be more drastic for some institutions and less drastic for others. The general principle, I think, is that the complexity and focus of both the supervisory examination and the capital requirements should be determined by the complexity, diversity, and perhaps the scale of the organization being examined. This suggests not only different approaches across nations but also different approaches within nations. A one-size-fits-all supervisory and regulatory framework is simply inconsistent with the existing and evolving banking structure. Banks are just too different, with different risk profiles, risk controls, strategies, and approaches to managing risks, to be supervised and regulated by one yardstick. Similar institutions should be supervised and regulated similarly, and different institutions differently. The consultative document released by the Basel supervisors in June recognizes this multitrack approach. Agendas for action By the United States In the United States, this suggests that the current structure of supervision and regulation – including minimum capital rules – probably does not have to be changed very much for most banks and perhaps can even be simplified for some. But there is a small subset of megabanks that, through growth and consolidation, have reached a scale and diversity that would threaten the stability of financial markets around the world in the event of their failure – or even if they faced severe stress under certain circumstances. For these larger, complex banking organizations, the Federal Reserve has already begun a different supervisory focus, and we believe that further modifications are required in both that approach and capital regulations.
We have chosen about thirty US banking organizations – about one-third of them subsidiaries of foreign banks, by the way – whose scale, complexity, and diversity distinguish them from other organizations, especially in the role that they play in US and world financial markets. For each of these large, complex banking organizations – creatively known as LCBOs in supervisory circles – we have established dedicated teams of examiners, assisted by roving teams of specialists in payments systems, risk management, information technology, financial engineering, and modeling. Each examiner team is headed by a Central Point of Contact, and that team is dedicated full-time to understanding and supervising everything about that organization – especially its risk profile, risk controls, and strategy. Just as each institution is different, the team is supplemented by different experts as needed. As these institutions grow more sophisticated and complex, our challenge is to develop the skills to evaluate their activities. But scale and complexity imply that the supervisor cannot accomplish the job alone, or at least cannot do so without a degree of intrusion and a network of rules and regulations that would be simply inconsistent with the need for flexibility and rapid response by financial businesses operating in an increasingly complex market environment. We have no choice, therefore, but to rely increasingly on market discipline, both as a supplement to supervision and regulation and as a source of information to the supervisors. Such market discipline – which in practice can best be applied only to those large institutions that rely significantly on uninsured on- and off-balance-sheet liabilities to finance their activities – requires a larger scale and scope of public disclosure than has so far characterized banking, even with the substantial disclosures already made by large US banks. Information on the risk categories of credit exposure, credit concentration, and exposure retained in securitizations is an example of the kind of disclosure that may be required if the cost and availability of funding are truly to reflect the riskiness of individual institutions. Capital regulation for LCBOs also must change. Best-practice banks in the United States have already begun the process of risk-categorizing their portfolios and using historical data to establish internal capital allocations, loan loss reserving, and pricing. Our examiners have been instructed to begin evaluating these systems and urging their improvement. We have also begun work on how best to tie the required minimum capital regulation for an individual large bank directly into its own internal risk-profiling system, rather than to one or even multiple externally defined “risk buckets”. This isn’t going to be easy. The US regulators and our colleagues abroad are hard at work on how to do it. At the outset, I am sure, the approach will be relatively simple, but as both banks and supervisors learn, it will become more sophisticated. The framework for LCBOs in the United States, then, will be based on the three pillars discussed in the Basel Supervisors consultative paper: market discipline, supervision, and capital regulation. My personal view is that in the near term we will have to rely more on supervision, supplemented increasingly by market discipline, as we develop and deploy the revised capital regulations.
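As an illustration of the direction just described, tying required capital to a bank's internal risk ratings rather than to externally defined buckets, here is a minimal sketch. The grade-to-risk-weight mapping is hypothetical (the actual architecture was still being worked out at the time of these remarks); only the 8% minimum total capital ratio is taken from the existing Basel Accord.

```python
# Illustrative sketch of capital tied to internal risk ratings.
# The grade-to-weight mapping is hypothetical; only the 8% minimum
# total capital ratio is taken from the existing Basel Accord.

RISK_WEIGHTS = {1: 0.20, 2: 0.50, 3: 1.00, 4: 1.50}  # internal grade -> weight
MIN_CAPITAL_RATIO = 0.08

def required_capital(exposures):
    """exposures: iterable of (internal_grade, dollar_amount) pairs."""
    rwa = sum(RISK_WEIGHTS[grade] * amount for grade, amount in exposures)
    return MIN_CAPITAL_RATIO * rwa

book = [(1, 100e6), (2, 250e6), (3, 400e6), (4, 50e6)]
print(f"Required capital: ${required_capital(book):,.0f}")  # -> $49,600,000
```

The open questions flagged in these remarks, validating the internal grades and calibrating the weights consistently across banks, amount to deciding what belongs in that mapping.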
By emerging market economies In emerging market economies, national authorities have to develop the expertise in supervision and regulation to monitor the activities of complex financial organizations and foster enhanced risk management practices in their local industry. But we must appreciate that such experience accumulates only over time. In the interim, national authorities may well want to consider supplementing their supervisory and regulatory framework with quantitative restrictions that limit risk-taking. When the skills to interpret regulations flexibly are in short supply, national authorities might prefer to bind themselves to simple rules. Such rules presumably would be structured to prevent the outsized behaviors that in the past have preceded financial crises, such as extremely rapid growth in lending for property development or a large share of real estate loans on depositories’ balance sheets. I would also like to point out that the institutional depth to both manage and examine complex banking organizations need not always be homegrown. While I appreciate the natural sensitivities of countries trying to build their own economic capabilities, emerging market economies seeking to strengthen their own financial systems should not restrict – and, indeed, may want to encourage – entry by foreign banks. Foreign banks will bring with them the human and financial capital that can raise the level of financial expertise for the entire industry, to the benefit of local banking services. The presence of such global banks will also foster the development of complementary institutions, such as credit-rating agencies and accounting firms, that will be valuable for both local institutions and national supervisors. Those foreign banks also will likely have access to strong parents, implying that the resources that can be applied to quelling financial turmoil will extend beyond the limits of the national central bank. By international organizations We must also recognize that the agendas for strengthening banking systems in industrial and emerging market economies are intertwined. There are important cooperative advances to be made, starting with progress on monitoring compliance with international standards. But all the effort in establishing standards and guidelines will go for naught if there are not clear, comprehensive procedures for monitoring the performance of banks in meeting them and incentives for adopting them. Part of this, no doubt, will require strengthening cooperation among national supervisors. Many financial institutions have increased their global reach, and those that have not are still affected by developments abroad. Supervisors must therefore also strengthen their connections outside their national markets. While the events of the last few years that have buffeted world markets caught us by surprise, they were not a total surprise. With the benefit of hindsight, we can pick out warning signals that were missed in some countries in advance of the Asian crisis, including an overreliance on leverage, a troubling buildup of short-term financing, and an overvalued exchange rate. With LTCM, there was similar excessive leverage and reliance on short-term financing arrangements. Going forward, international organizations and national authorities will have to invest more resources in monitoring markets for signs of stress.
While there is no single indicator of banking or balance-of-payments crises, the tracking of financial market prices in many markets and of financial flows across borders should help to identify trouble spots (a stylized illustration follows at the end of this passage). Where appropriate, national authorities should consider broadening the information they collect. Complementary to these efforts, national authorities can take steps to facilitate transparency within markets. The key to avoiding excessive leverage is the market discipline that should be provided by market participants’ creditors and counterparties. But market discipline works well only if counterparties share sufficient information to allow reliable assessments of their risk profiles. Supervisors need to ensure that, before establishing credit relationships, regulated entities have a clear picture of a counterparty’s risk profile and have assurance that information relevant to that relationship will be available on a sufficiently timely and ongoing basis. Public disclosure also has an important role to play, and authorities should make sure that appropriate requirements are in place. To be sure, public disclosure is unlikely to be sufficiently timely or detailed to meet the needs of creditors. Still, it is essential to protect retail depositors and investors, and it provides a standardized framework from which customized bilateral disclosures can be drawn and elaborated. Lastly, industrial countries have the responsibility to assist in the training of supervisors in emerging market economies, an area in which, I am pleased to say, we in the Federal Reserve System have been active for a while. We cannot afford not to take this responsibility, and in this regard, virtue is more than its own reward. We benefit in providing such technical assistance by strengthening our contacts with supervisors abroad, which is important when examining internationally active institutions based here at home, and by reducing the potential for adverse shocks from abroad. Issues for further consideration As I noted, one of the contributing factors to Asian financial distress was the ill-considered buildup of short-term foreign currency borrowing by banks in these nations from banks in industrial countries. Some have attributed such behavior to the effects of an inappropriately low capital charge in the Basel Accord for short-term interbank credit extensions. They argue that the experience requires an increase in capital charges in order to effectively control the quantity of interbank loans. I do believe that we, in fact, should question the treatment of interbank credit by the Accord. Credit risk, in my view, ought to be determined by the specific circumstances of the individual borrower. If that borrower is really the sovereign, let us be up front about it. If the borrower is really a bank, its capital classification should not be determined simply by its home country. The Capital Accord should not lend its support to the unfortunately traditional presumption of external creditors that countries need to stand behind all external borrowings by their banks. So, by all means, I am firmly in the camp for adjusting interbank capital charges. But I doubt that an alignment of risk weights for interbank credit itself would have had much effect on the borrowing and lending behavior we saw in the runup to the Asian financial crisis. This is not to say that interbank lending has not been a problem. But the problem may be us.
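The stylized illustration promised above is one simple screen of the kind such tracking might use: flag a credit spread that moves far outside its recent range. This is not a method endorsed in the remarks, and the spread series and threshold below are made-up illustrative values.

```python
# Hypothetical stress screen: flag when a credit spread (in basis points)
# sits more than 3 standard deviations above its trailing mean.
# The spread series below is made-up illustrative data.
from statistics import mean, stdev

def stress_flags(spreads_bps, window=20, threshold=3.0):
    """Yield (index, spread) pairs where the spread is an outlier
    relative to the trailing `window` observations."""
    for i in range(window, len(spreads_bps)):
        hist = spreads_bps[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and (spreads_bps[i] - mu) / sigma > threshold:
            yield i, spreads_bps[i]

spreads = [60, 62, 58, 61, 59, 63, 60, 62, 61, 59,
           60, 64, 62, 61, 63, 60, 62, 61, 63, 62, 140]  # spike at end
print(list(stress_flags(spreads)))  # -> [(20, 140)]
```

In practice one would run many markets and cross-border flow series through such screens simultaneously; the point is only that systematic monitoring can surface trouble spots for closer inspection.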
That is, the problem may be of our own making: we have created a significant moral hazard by, in effect, making lending banks whole, and even increasing their returns, when the borrowing banks in emerging nations are unable to meet their obligations. When confronted with the reality, no authority has been willing to face the implications of bank defaults for the losses of the lending banks or the implications of the unavailability of new bank credit for the nation in default. But our perfectly rational short-run decisions create the incentive for lending banks to do the same thing again, and again, and again. Returns are reasonably high, and the risk to them is reasonably low. The issue is clearly the short-run-long-run tradeoff, and I do not have much to add to the argument about interbank lending other than that some change in the architecture and process must be made or the cycle will continue; and that change must involve some genuine risk-taking by the lending banks if we are to reduce, if not eliminate, the moral hazard we have created. Conclusion A final word about short-run-long-run tradeoffs. Some have argued that both risk-sensitive capital charges and market discipline will be pro-cyclical. The rationale, as I understand it, is that following a peak, as economic conditions deteriorate, evaluations of risk will change, and risk premiums will rise. As more potential borrowers are viewed as riskier, banks will be even less willing to lend so as to avoid facing higher capital charges, higher borrowing costs, or reduced availability of funds in the market. In an improving economy, the opposite occurs. As one of my colleagues states, the problem with market discipline and risk-based capital is that they work. If and as they work, we may well observe what the critics note. But that short-run effect has to be evaluated against the long run, and a judgment reached about the terms of the tradeoff. For in the long run, both market discipline and risk-based capital charges affect ex ante risk appetites because lenders can calculate the likely impact of their actions. The resultant change in behavior should reduce the amplitude of cycles, and any resultant pro-cyclicality has to be evaluated against that backdrop. Or, more generally, short runs have to be evaluated against the backdrop of long runs, regardless of Mr Keynes’s unfortunate observations about the latter.
Mr Greenspan looks at the evolution of bank supervision Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the American Bankers Association, Phoenix, Arizona on 11 October 1999. * * * I appreciate once again being invited to participate in your annual convention. The convention theme, “Creating Sustainable Competitive Advantage,” is well chosen for an industry that continues to be characterized by dramatic change. This morning I should like to address one aspect of the response to change – the evolution of bank supervision. As the theme of your convention suggests, the economic landscape is continually evolving. To remain competitive, individual banks must adapt, and they have. It is no less natural to expect that supervisory policies and practices also would evolve and adapt over time. Banking supervision is responding, though admittedly not as rapidly as the industry itself. This is not necessarily all bad. The physician’s admonition “First do no harm” is a desirable starting point for bank supervisors as well. Nevertheless, significant changes are in the pipeline. Today I would like to sketch the framework that is now being developed, growing out of work at the Federal Reserve, at other US banking agencies, and among our colleagues abroad. The outline of this framework was presented in a consultative document released by the Basel Supervisors Committee in June. The details are preliminary, but the concepts are beginning to congeal and deserve your close attention, especially if you wish to influence the eventual outcome of the deliberations. The first concept I would highlight is that the scope and complexity of prudential policies should conform to the scope and complexity of the bank entities to which they are applied. This means, in practice, that few changes to the present system are necessary for the vast majority of banks in the United States. More broadly, however, a one-size-fits-all approach to regulation and supervision is inefficient and, frankly, untenable in a world in which banks vary dramatically in terms of size, business mix, and appetite for risk. Even among the largest banks, no two institutions have exactly the same risk profiles, risk controls, or organizational and management structure. Accordingly, prudential policies need to be customized for each institution: the more complex an institution’s business activities, the more sophisticated must be our approach to prudential oversight. The need for a multi-track approach to prudential oversight is particularly evident as we face the reality that the megabanks being formed by growth and consolidation are increasingly complex entities that create the potential for unusually large systemic risks in the national and international economy should they fail. No central bank can fulfill its ultimate responsibilities without endeavoring to ensure that the oversight of such entities is consistent with those potential risks. At the same time, policymakers must be sensitive to the tradeoffs between more detailed supervision and regulation, on the one hand, and moral hazard and the smothering of innovation and competitive response, on the other. Heavier supervision and regulation designed to reduce systemic risk would likely lead to the virtual abdication of risk evaluation by creditors of such entities, who could – in such an environment – rely almost totally on the authorities to discipline and protect the bank.
The resultant reduction in market discipline would, in turn, increase the risks in the banking system, quite the opposite of what is intended. Such a heavier hand would also blunt the ability of US banks to respond to crisis events. Increased government regulation is inconsistent with a banking system that can respond to the kinds of changes that have characterized recent years, changes that are expected to accelerate in the years ahead. The desirability of limiting moral hazard, and of minimizing the risks of overly burdensome supervision and regulation, has motivated those of us associated with the Basel exercise to propose a three-pillared approach to prudential oversight. This approach emphasizes, and seeks to strengthen, market discipline, supervision, and minimum capital regulation. Market discipline In trying to balance the necessary tradeoffs, and in contemplating the growing complexity of our largest banking organizations, it seems to us that the supervisors have little choice but to try to rely more – not less – on market discipline, augmented by more effective public disclosures, to carry an increasing share of the oversight load. This is, of course, only feasible for those, primarily large, banking organizations that rely on uninsured liabilities in a significant way. To be sure, these organizations already disclose a considerable volume of information to market participants, and, indeed, there is ample evidence that market discipline now plays a role in banking behavior. Nonetheless, the scale and clarity of disclosures are better at some institutions than at others and, on average, could be considerably improved. With more than a third of large-bank assets funded by uninsured liabilities, the potential for oversight through market discipline is significant, and success in this area may well reduce the need to rely on more stringent governmental supervision and regulation. The channels through which market discipline works are, of course, changes in access to funds and/or changes in risk premia as banks take on or shed risk or engage in certain types of transactions. The changing cost and availability of bank funding affect the ex ante risk appetites of bank management and serve as market signals of a bank’s condition to market participants and to examiners. But the prerequisite to the enhancement of market discipline in conjunction with supervision and regulation is improvement in the amount and kind of public disclosure that uninsured claimants need about bank activities and on- and off-balance-sheet assets in order to make informed judgments and to act on those judgments. Information on loans by risk category and information on residual risk retained in securitization are examples. The best way to encourage more disclosures is not yet clear. Our intent is to consult with the industry regarding the establishment of new disclosure standards and ways to evaluate their application. Supervision Improved public disclosure will, we believe, not only enhance market discipline but also create further incentives for improvements in banks’ risk-management practices and technologies. Such improvements will enhance the supervisory pillar of prudential oversight. If supervisors are comfortable with a bank’s internal risk-management processes, the most cost-effective approach to prudential oversight would have supervisors tap into that bank’s internal risk assessments and other management information.
To be sure, some “transaction testing” of risk-management systems by supervisors will necessarily remain. But as internal systems improve, the basic thrust of the examination process should shift from largely duplicating many activities already conducted within the bank to providing constructive feedback that the bank can use to enhance further the quality of its risk-management systems. It is these internal bank systems – coupled with public disclosure – that provide the first line of defense against undue risk-taking. Indeed, it should be emphasized that the focus of supervision and regulation – especially for the larger institutions – should be even less on detail and more on the overall structure and operation of risk-management systems. That is the most efficient way to address our interest in both the safety and soundness of the banking system and the overall stability of financial markets. Relying more extensively on banks’ internal risk-management systems can also enhance prudential assessments of a bank’s capital adequacy. As the Basel consultative document suggests, over time our examination process for assessing bank capital adequacy would try to use the same techniques that banks are, and will be, using to evaluate their risk positions and the capital needed to support these risks. To spur this process in the United States, the Federal Reserve in June issued new examination guidance encouraging the largest and most complex banks to carry out self-assessments of their capital adequacy in relation to objective and quantifiable measures of risk. These self-assessments will be evaluated during on-site examinations and, eventually, are to be a factor in assigning supervisory ratings. A key component of these self-assessments will be each bank’s internal risk evaluations of the credit quality of its customers and counterparties. Currently, such internal risk ratings are beginning to be used by a small number of banks in their risk-management, pricing, internal economic capital allocation, and loan loss reserve determinations. Some banks are further along in the process than others. Virtually all large banks are moving in that direction. The bank regulators already have begun incorporating reviews of these internal risk-rating processes into their on-site examinations, and last July the Federal Reserve issued examination guidance on this subject, including a summary of emerging sound practices in this area. The supervisory policies and procedures being contemplated will build on the increasingly sophisticated management and control systems that are rapidly becoming part of banks’ best-practice risk-management mechanisms. They will, as well, require both good judgment and sophistication on the part of bank examiners in order to avoid a cookie-cutter application of policies and to develop skills at evaluation that will match those available to the banks. For each of about thirty large, complex banking organizations – in Washington parlance, LCBOs – the Federal Reserve has already established dedicated teams of examiners, supplemented by experts in areas ranging from clearance and settlement to value-at-risk and credit-risk models. Each LCBO team is directed by a senior Reserve Bank official. Both that “central point of contact” and his or her supervisory team will be charged with following one LCBO and understanding its strategy, controls, and risk profile. Jointly, these teams will represent the Federal Reserve supervisory pillar as it applies to LCBOs.
Minimum capital regulation In addition to emphasizing more effective market discipline and making supervisory assessments of bank capital adequacy more risk-focused, the June consultative paper highlights the need to make regulatory capital requirements – the third pillar of prudential oversight – more risk-focused as well. In recent years, it has become clear that the largely arbitrary treatment of risks within the current Basel Accord has distorted risk-management practices and encouraged massive regulatory capital arbitrage. That is, our rules have induced bank transactions that have the effect of reducing regulatory capital requirements more than they reduce a bank’s risk position. Consequently, the fundamental credibility of regulatory capital standards as a tool for prudential oversight and prompt corrective action at the largest banking organizations has been seriously undermined. Reflecting the considerable differences among banks that I mentioned earlier, US supervisors are developing proposals for a multi-track approach to address modifications to the regulatory capital rules, and this approach has been incorporated in the Basel consultative document. Within the United States, consideration is being given to a standardized capital treatment involving a quite simple regulatory capital ratio that might become applicable to the vast majority of institutions that are not internationally active. For another group of banks, change might involve such modest refinements to current capital requirements as closing certain loopholes and basing some risk-weights on available external credit ratings. For those comparatively few banking organizations whose scale, complexity, and diversity warrant a more sophisticated approach to capital adequacy, the Basel Committee has proposed another track that, at least initially, would seek to link regulatory capital requirements to the banks’ internal risk ratings that I discussed earlier. Under this approach, the risk-weight assigned to a particular credit position would be based on the internal risk rating assigned by the bank holding that instrument. Regulatory staffs in the United States and other countries are currently attempting to work out the basic architecture of such an approach. Critical issues include how to validate banks’ internal risk ratings and how to link risk-weights to these internal ratings so as to ensure economically meaningful and reasonably consistent capital treatment of similar risks across banks. This is an extremely difficult undertaking, and its success will require unprecedented collaboration between – and among – supervisors and the banking industry. The framework It is, I believe, important to reiterate my earlier comment that bank supervision and regulation – especially capital regulation – are necessarily dynamic and evolutionary. We are striving for a framework whose underlying goals and broad strategies can remain relatively fixed, but within which changes in application can be made as both bankers and supervisors learn more, as banking practices change, and as individual banks grow and change their operations and risk-control techniques. Operationally, this means that we should not view innovations in supervision and regulation as one-off events. Rather, the underlying framework needs to be flexible and to embody a workable process by which modest improvements in supervision and regulation at the outset can be adjusted and further enhanced over time as experience and operational feasibility dictate.
In particular, we should avoid mechanical or formulaic approaches that, whether intentionally or not, effectively “lock” us into particular technologies long after they become outmoded. We should be planning for the long pull, not developing near-term quick fixes. It is the framework that we must get right. The application might initially be bare-bones but over time become more sophisticated. For example, it could begin with a limited number of risk “buckets” and, over time, be expanded to include not only more risk categories but also the use of an individual bank’s full credit-risk model – all within the same supervisory framework and unique to each bank. The design of the improved oversight approach is a work in progress. We are endeavoring to develop a program that is the least intrusive, most market-based, and most consistent with current and future sound risk-management practices possible, given our responsibilities for financial market stability.
Mr Meyer discusses the new challenges for monetary policy identified at the Jackson Hole conference Remarks by Mr Laurence H Meyer, Member of the Board of Governors of the US Federal Reserve System, before the Department of Finance Lecture Series, University of Missouri, Columbia, Missouri on 12 October 1999. * * * Each year in late August, for the last 23 years, the Federal Reserve Bank of Kansas City has hosted a conference where central bankers from around the world tackle issues relevant to the setting and implementation of monetary policy. They are joined by academics, former Federal Reserve governors, and private-sector economists. A core group returns every year and contributes a particularly strong sense of camaraderie. The lure of the conference is the extraordinary combination of an excellent choice of topics, a high level of formal presentations and discussions, and the opportunity to engage in a series of more informal conversations over coffee, at meals, and on hikes – and to do all this in the shadow of the Tetons. I told Tom Hoenig, the President of the Kansas City Fed, that when I was considering whether or not to accept the nomination to join the Board of Governors, the deciding factor was the prospect that, by doing so, I would receive a lifetime perk of an invitation to the Jackson Hole Conference. The themes are always interesting, but I found this year’s conference particularly stimulating – so much so that I decided to use the presentations and discussions at Jackson Hole as the organizing framework for today’s lecture. The title of this year’s conference was “New Challenges for Monetary Policy.” The papers both described the emerging consensus about the objectives and strategies underlying the conduct of monetary policy and highlighted a number of new challenges. In the first section of today’s lecture I will outline the framework: I will identify the objectives of monetary policy, describe the operational procedures used to conduct monetary policy, and discuss the strategy for carrying out monetary policy to achieve those objectives. A number of participants at the conference suggested that there was an emerging consensus about the appropriate strategy – specifically, flexible inflation targeting. Then I will turn to the new challenges for monetary policy identified at the conference. Each is illustrated by current or recent experience around the world. First, does a low-inflation environment – and specifically the possibility that nominal interest rates might decline to zero in such an environment – constrain the effectiveness of monetary policy, and if so, how can monetary policy adjust to maintain its effectiveness? Second, how should monetary policy respond to movements in asset prices in general and, specifically, to the possibility of asset-market bubbles? Third, what is the best option for exchange rate and monetary policy regimes for small open emerging market countries, in light of increased globalization and especially larger and more volatile international capital flows? The basic theme of the conference was that flexible inflation targeting provides a constructive response to each of these challenges. Careful design and implementation of the framework, including the choice of an inflation target, would, for example, reduce the prospect that monetary policy might lose its ability to stimulate the economy further because nominal short-term interest rates had fallen to zero.
Similarly, following such a disciplined monetary policy would mitigate, though not entirely eliminate, the effect of asset bubbles. Finally, countries that move to a flexible exchange rate regime – as many have done recently – need to put in place a disciplined monetary framework such as is offered by this approach. I. The monetary policy framework Let me start this discussion by identifying what the objectives of monetary policy are, and should be, and then discuss what strategies are useful in achieving the objectives. A. Objectives of monetary policy One of the themes of the conference was that there is an emerging consensus about the objectives of monetary policy, one that has been reflected in the conduct of monetary policy in many countries for some time and now is finding its way into the rhetoric of policymakers. I would describe the consensus as an acceptance of a dual mandate for monetary policy. Monetary policy seeks first to achieve and maintain price stability over the longer run and, second, to retain the flexibility to dampen cyclical fluctuations in the economy around full employment. That is, I suspect, a sharper statement than many (and indeed most) central banks today feel comfortable with. At the conference, the consensus was described as “flexible inflation targeting.” Most central banks want to emphasize, with good reason, their price stability objective. This reflects a couple of strongly held views. First, monetary policy, in the long run, principally affects nominal variables such as nominal income, the price level, and the rate of inflation, but has lesser effects on real variables – such as the level of employment or the level or growth rate of output. I expect all central bankers would agree that an environment of price stability offers the best contribution that monetary policy can make to the level and growth of output because it eliminates distortions to resource allocation and disincentives to saving and investment associated with high and variable inflation rates. It follows that a more accommodative monetary policy cannot foster a higher average rate of real growth. The second strongly held view is that because monetary policy is the principal determinant of the inflation rate in the long run, central banks have a responsibility to set an appropriate target for long-run inflation and achieve it. And that target should be price stability or, at the least, a low rate of inflation. While monetary policy cannot raise the level or rate of growth of output over the long run through any means other than maintaining price stability, it is widely, though not universally, accepted that monetary policy can affect the level and growth rate of output in the short run and, perhaps, therefore contribute to smoothing out fluctuations in the economy around full employment. This is sometimes referred to as the short-run stabilization objective for monetary policy. A central issue for monetary policy is how to balance the dual objectives of price stability and output stabilization and how explicit to be about the commitment to these dual objectives. Both theoretical and, especially, empirical macroeconomics have established the existence of an inescapable trade-off affecting the conduct of monetary policy. The trade-off is between the variability of inflation around its target (zero or some low rate) and the variability of output around its target (the full-employment level of output or potential output). 
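One standard way to formalize this trade-off, drawn from the academic inflation-targeting literature (for example, Svensson's work) rather than from the lecture itself, is a quadratic loss function over the two gaps:

$$ \mathcal{L} \;=\; E\big[(\pi_t - \pi^{*})^{2}\big] \;+\; \lambda\, E\big[(y_t - y^{*})^{2}\big] $$

Here $\pi^{*}$ is the inflation target, $y^{*}$ is potential output, and $\lambda$ is the weight on output stabilization: $\lambda = 0$ corresponds to strict inflation targeting, while any $\lambda > 0$ yields the flexible inflation targeting described here, in which supply shocks force a choice between inflation variability and output variability.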
Autonomous increases or decreases in aggregate spending push output and inflation in the same direction relative to their targets and, therefore, do not bring this trade-off into play. But supply shocks – such as increases or decreases in oil or food prices – drive output and inflation in opposite directions relative to their respective targets. The more quickly monetary policy reacts to restore inflation to its target following a supply shock, the greater will be the variability in output relative to its target. The reason for this trade-off is that monetary policy affects inflation primarily through its initial effect on the amount of slack in the economy. Tightening monetary policy slows spending growth, opens up some slack temporarily in labor and product markets, and allows the slack to reduce inflation. Once inflation has returned to its target, policy can guide the economy back to full employment. It probably takes a certain amount of slack over time to reduce inflation by a given amount, but reducing inflation rapidly means opening up an especially large output gap for a short period – hence the trade-off between inflation and output stability. Several countries have moved to inflation-targeting regimes over the last decade or so, setting a numerical target for inflation. This was generally part of a process of shifting responsibility for monetary policy from finance ministries to independent central banks and often followed a period of poor macroeconomic performance, especially high and variable inflation. The newly independent central banks often identified inflation as the singular objective of monetary policy to gain credibility and facilitate the transition to price stability. In addition, the government wanted to ensure accountability of the central bank and hence often opted for a narrow and explicit objective. With price stability now largely accomplished, some of these central banks are becoming more flexible in their approaches to monetary policy by recognizing a role for short-run stabilization. B. How explicit should the objectives be? In the United States, Congress has set the objectives for monetary policy in the Federal Reserve Act, as amended in 1977. The objectives are maximum employment and stable prices, which are mutually consistent and achievable if maximum employment is interpreted as the maximum employment sustainable without rising or falling inflation. This is an explicit expression of a dual mandate. Inflation-targeting countries typically have goals of about 2% to 2½% for inflation and sometimes establish a range for inflation, for example, 1% to 3%. New Zealand, Canada, Australia, and the United Kingdom are examples of countries with explicit numerical targets, and there are many others. It was widely agreed at Jackson Hole that the United States has, without an explicit target, achieved the same success in reducing inflation as countries with explicit numerical targets. I will not be considering the pros and cons of an explicit numerical target for inflation today, although I recognize this is an important question and one that warrants further discussion. C. How should policy be conducted to achieve the objectives? Once the objectives of policy are set, a central bank must choose an operating regime and then develop a strategy for using its instruments to achieve the objectives. 
Virtually all central banks carry out monetary policy operations by influencing – in effect setting – some short-term nominal interest rate, typically the rate on overnight inter-bank loans. The FOMC at each meeting sets a target for the federal funds rate and instructs the Manager of the Open Market Desk to achieve that target over the intermeeting period by buying or selling securities. By adjusting this single rate, the Federal Reserve affects the broader array of interest rates and asset prices in the economy and, in turn, affects aggregate demand, the level of real economic activity, and inflation. There are two ways of describing the strategy for monetary policy. One focuses on “instrument rules,” which describe how the policy instrument – in this case a short-term interest rate – should be moved in response to economic developments. Such a rule was designed by Professor John Taylor of Stanford University. The Taylor Rule describes how the federal funds rate should be adjusted in response to movements in output relative to its long-run sustainable level and to movements in inflation relative to its target. The Taylor Rule thus explicitly embodies the dual objectives of monetary policy and is a form of flexible inflation targeting (a stylized version of the rule is sketched at the end of this passage). In practice, no one expects monetary policy to be conducted according to a rigid rule. Such rules can, however, be useful in informing policy decisions and helping policymakers calibrate their responses to changes in utilization and inflation rates. Moreover, research on how such rules affect the quality of macroeconomic performance can aid policymakers in arriving at their decisions. A second strategy is to move the instrument in response to the inflation forecast. In this approach, the policymakers start with a forecast of inflation over some interval, typically about two years. Policy is then set over this interval to achieve the price stability target by the end of the period. The interval chosen to reach the inflation target takes into account the fact that the more rapid the return to the target, the greater the variability in output relative to its target. The time interval is thus a vehicle for allowing policymakers to damp movements in output around full employment and at the same time ensure that the inflation objective is eventually achieved. Inflation forecast targeting has the advantage of being explicitly forward-looking. The Taylor Rule, in contrast, appears to be backward-looking, though the contrast is not as clear in practice as it might appear. Forecasts are exercises in processing information about current and past developments to yield anticipated future outcomes. The Taylor Rule takes explicit account of only very recent observations on inflation and output, but these are, to be sure, important determinants of future inflation. An inflation forecast approach allows policymakers to consider a wider range of current and past information in projecting future outcomes. D. Setting the inflation target Setting the inflation target to be consistent with price stability sounds straightforward, but an important theme at this summer’s conference was the variety of options and implications of this choice. The obvious choice would be a target of zero for inflation, after taking into account biases in published price measures. But an intriguing alternative is to set a target for the price level. If a disturbance results in a period of inflation, under a zero inflation target the objective is simply to return to a zero inflation rate.
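The stylized sketch promised above follows. The 0.5 response coefficients and the 2% values for the equilibrium real rate and the inflation target are the illustrative numbers from Taylor's 1993 formulation, not parameters endorsed in this lecture.

```python
# A stylized version of the Taylor (1993) rule described in the text.
# r_star (equilibrium real rate), pi_star (inflation target), and the
# 0.5 response coefficients are Taylor's original illustrative values.

def taylor_rule(inflation: float, output_gap: float,
                r_star: float = 2.0, pi_star: float = 2.0) -> float:
    """Recommended federal funds rate, all arguments in percent.

    inflation:  inflation over the previous four quarters
    output_gap: percent deviation of real output from potential
    """
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

# Example: 3% inflation with output 1% above potential calls for 6.0%.
print(taylor_rule(3.0, 1.0))
```

The two 0.5 coefficients are what make this a form of flexible inflation targeting: the rule leans simultaneously against deviations of inflation from target and of output from potential.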
Under the zero inflation target, when inflation has been returned to zero the price level will nonetheless be higher than it was before the disturbance. Under a constant price level target, the aim is to return to the initial price level, requiring a period of deflation to offset the effect of the period of higher inflation. This can produce a more predictable price level in the long run, but many analysts are concerned about how the economy would respond to a period of deflation. Yet another alternative is to target a low positive inflation rate – specifically, an inflation rate somewhat above a level that reflects estimates of the bias in published measures. One of the issues I will discuss shortly is whether targeting a low, positive rate of “true” inflation (what I call price stability plus a cushion) might result in better cyclical performance of the economy than a zero inflation target. Still another choice is an average inflation target. If the target were 2% and inflation temporarily moved to 3%, an average inflation target policy would encourage a decline in the inflation rate to below 2% for a while, moving the average back to 2%. This allows for a predictable (albeit rising) long-run price level, while avoiding the deflationary episodes that would occasionally be called for under a constant price level target. II. Monetary policy in a low-inflation environment I turn now to the challenges associated with the conduct of monetary policy in a low-inflation environment. The issue here is whether, at very low inflation rates, the cyclical performance of the economy would deteriorate. If this were the case, the objectives or strategy of monetary policy would need to be adjusted. Among the possible responses is adjusting the definition of the inflation target, so I will be building on the preceding discussion. A. Keynes’ liquidity trap and the zero nominal bound problem John Maynard Keynes, in his classic work, The General Theory of Employment, Interest and Money, warned that monetary policy might become ineffective once interest rates fell to some low level at which wealth owners might become indifferent as to whether they held money or bonds. In the language of economists, money and bonds might become perfect substitutes. In this case, it would be impossible for monetary policy to affect interest rates by affecting the composition of portfolios, specifically the amount of money held relative to bonds. Keynes called this situation a “liquidity trap.” Since the end of the Great Depression, many have interpreted Keynes’ liquidity trap to be a theoretical curiosity rather than a practical problem likely to confront policymakers. But with short-term rates now at zero in Japan and low inflation almost everywhere in the industrialized world, the problem is taken more seriously by central banks – to the point that it was one of the topics at Jackson Hole. Keynes’ views on the liquidity trap have, in my view, often been misunderstood. Keynes understood that central banks could push rates on short-term government debt to zero. The liquidity trap, as Keynes used the term, is better thought of as a term-structure trap or, more generally, a limit on how low long-term and private interest rates can go once the interest rate on short-term government debt is pushed to zero. When short-term government rates reached zero, Keynes believed that there would still be positive interest rates on both longer-term government securities and private debt and that monetary policy then would be unable to push those rates any lower.
II. Monetary policy in a low-inflation environment

I turn now to the challenges associated with the conduct of monetary policy in a low-inflation environment. The issue here is whether, at very low inflation rates, the cyclical performance of the economy would deteriorate. If this were the case, the objectives or strategy of monetary policy would need to be adjusted. Among the possible responses is adjusting the definition of the inflation target, so I will be building on the preceding discussion.

A. Keynes’ liquidity trap and the zero nominal bound problem

John Maynard Keynes, in his classic work, The General Theory of Employment, Interest and Money, warned that monetary policy might become ineffective once interest rates fell to some low level at which wealth owners might become indifferent as to whether they held money or bonds. In the language of economists, money and bonds might become perfect substitutes. In this case, it would be impossible for monetary policy to affect interest rates by affecting the composition of portfolios, specifically the amount of money held relative to bonds. Keynes called this situation a “liquidity trap.”

Since the end of the Great Depression, many have interpreted Keynes’ liquidity trap to be a theoretical curiosity rather than a practical problem likely to confront policymakers. But with short-term rates now at zero in Japan and low inflation almost everywhere in the industrialized world, the problem is taken more seriously by central banks – to the point that it was one of the topics at Jackson Hole.

Keynes’ views on the liquidity trap have, in my view, often been misunderstood. Keynes understood that central banks could push rates on short-term government debt to zero. The liquidity trap, as Keynes used the term, is better thought of as a term-structure trap or, more generally, a limit on how low long-term and private interest rates can go once the interest rate on short-term government debt is pushed to zero. When short-term government rates reached zero, Keynes believed that there would still be positive interest rates on both longer-term government securities and private debt and that monetary policy then would be unable to push those rates any lower.

The conventional view is that the level of long-term rates is determined by current and expected short-term rates. Given that market participants are unlikely to expect zero short-term rates to be sustained for the 20 or 30 years that long-term bonds have to maturity, rates on long-term securities will remain positive when short-term rates reach zero. Stated somewhat differently, shocks that would otherwise lower short-term rates cannot do so at the zero bound, while shocks that would raise short-term rates still will do so. In addition, private rates differ from government rates by an amount that reflects the risk of default on private debt, assuming that government debt is viewed as being default free. Even if the rate on government debt reaches zero, therefore, private debt will still carry positive rates.

The liquidity trap is sometimes referred to as the problem of the zero bound on nominal interest rates. Nominal interest rates cannot be negative, because, in this case, everyone would want to hold cash. Consequently, an environment of very low inflation would constrain how low monetary policy could push real interest rates in response to a recession and, therefore, be associated with less-favorable cyclical performance of the economy. If inflation were 2%, for example, monetary policy, by driving the nominal interest rate to zero, could push real interest rates to minus 2%. If prices were stable, on the other hand, the limit on the real interest rate would be zero, and this limit might constrain the ability of monetary policy to offset downward shocks to the economy.
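The constraint can be stated compactly with the Fisher relation, writing i for the nominal rate, r for the real rate, and \pi^e for expected inflation:

    r = i - \pi^e, \qquad i \ge 0 \quad \Longrightarrow \quad r \ge -\pi^e

With expected inflation of 2%, the floor on the real rate is minus 2%; under fully credible price stability, the floor rises to zero, leaving that much less room to offset a downturn.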
B. Nominal rigidities

Another long-standing explanation for why low inflation might result in a deterioration in macroeconomic performance is the possible existence of nominal wage rigidities – specifically, a reluctance to reduce nominal wages. Relative wage movements are important signals and incentives that guide labor resources to their most highly valued uses. When inflation is very low, achieving this variation in relative real wages depends on some wages falling. If nominal wage cuts are rare, efficiency in the allocation of resources may decline, and as a result, output might be lower at price stability than if there were some low rate of inflation. And in the absence of declines in nominal wages for some workers, average real wages will be higher, and hence average employment lower, at price stability.

Although there is some evidence of downward nominal wage rigidity, there is no evidence that this has had an effect on aggregate wage and price inflation or the natural rate of unemployment in the postwar period – even when inflation has been very low. In addition, it is not clear how much rigidity would remain if we achieved and sustained steady low inflation.

C. Japan’s current experience

As I noted earlier, Japan presents a laboratory for observing an economy with low inflation. Japan enjoyed effective price stability through most of the 1980s. In the 1990s it was hit with a number of adverse shocks, from bursting asset bubbles and associated banking system problems early in the decade to the financial meltdown of its Asian trading partners in late 1997. In continuing its efforts to move the economy toward recovery, the Bank of Japan last February lowered short-term interest rates to almost zero. Although Japan registered surprisingly robust growth in the first half, most observers see private-sector demand as still quite weak despite the low short-term rates. Nevertheless, the Bank of Japan has apparently exhausted its ability to stimulate the economy through conventional policy. This raises two questions: how could monetary policymakers have avoided this predicament, and, once they were in it, were there unconventional forms of monetary policy that would have permitted them to provide additional stimulus to demand?

D. How to avoid the problem

At the Jackson Hole conference, Mervyn King and Lars Svensson argued that a flexible inflation-targeting regime would help policymakers avoid this problem in the first place. While King in particular had some doubts about the zero nominal interest rate bound and especially about the existence of nominal rigidities, both he and Svensson noted that opting for a positive inflation rate as a target was a prudent way of avoiding testing either of these hypotheses. Indeed, they both noted that inflation-targeting central banks virtually always opt for inflation targets greater than zero and greater than estimates of the inflation bias in published measures of inflation. Recent research suggests that even a cushion of 1 percentage point (above an amount equal to the expected bias in inflation measures) can go a long way towards avoiding the problem of deteriorating cyclical performance at low inflation rates.

The second key to avoiding this problem is to have a symmetrical inflation target – one that evokes an aggressive response both to falling below and to rising above the inflation target. One could take this further. Monetary policymakers can always choke off inflation by raising real interest rates, because there is no limit to how high real interest rates can be pushed. There is a limit, however, to how low real interest rates can decline, given the zero nominal interest rate bound. Therefore, to the degree that any asymmetry is called for, it might be to move more quickly and more decisively with respect to downward than to upward disturbances to aggregate demand, at least when beginning from already low nominal interest rates. This allows policymakers to substitute speed for the magnitude of decline when responding to downside shocks.

A third possibility is that the zero nominal bound problem can be reduced or eliminated either by a credible price level target or by an average inflation target. If the price level falls in response to downside shocks, a price level target will imply that monetary policy will move more aggressively to stimulate the economy, once demand recovers and monetary policy has regained its effectiveness, than would have been the case with a traditional inflation target. This would ensure that a period of deflation will be followed by a corresponding period of inflation. Assuming bondholders take this more aggressive stimulus into account, they will project that zero nominal short-term interest rates will be maintained longer than would otherwise be the case, justifying lower long-term interest rates today. A similar result could be achieved by an average inflation target. If inflation was zero for a while, bondholders would project a period of inflation above the long-run inflation target – for example, 3% or 4% instead of 2% – until the average inflation rate returned to 2%. This would lead to expectations that short-term interest rates would remain low for a longer period and contribute to a decline in real long-term rates today.

E. What to do if nominal rates fall to zero?
If a target for the price level, a positive inflation rate, or an average inflation rate had not been implemented and made credible before a central bank was confronted by zero nominal interest rates, the central bank could, of course, introduce one at that time. However, such a move might lack credibility. It might be seen as an emergency program that might not be sustained once the economy recovered and moved away from the zero nominal bound. Moreover, with prices falling and the economy in recession, one could imagine a good deal of skepticism about the ability of the central bank to meet its objective.

Paul Krugman has urged the Bank of Japan to move to a positive inflation target as a way of lowering real interest rates and stimulating the economy. The Bank of Japan has resisted, arguing that, given its inability to increase aggregate demand, there would be little credibility in setting a positive inflation target. Even if the Bank of Japan today cannot expect to stimulate demand and thereby raise inflation, it will almost surely have this opportunity well before today’s long-term bonds mature. It could therefore commit today to maintaining a positive inflation rate when that becomes possible and thereby raise inflation expectations today and lower real interest rates. However, such a distant increase in inflation might not have much impact on the real cost of borrowing today, so I continue to be skeptical that initiating an inflation-targeting approach, once confronted by the zero nominal bound, offers a reliable way out of the problem.

A second approach would be to undertake unconventional monetary policy operations. Conventional monetary policy is implemented, as I described, by employing open market operations concentrated in repurchase agreements or in the short end of the government debt market. An alternative approach, sometimes referred to as a monetization strategy, focuses on increasing the money supply rather than on the level of short-term interest rates. At Jackson Hole, Allan Meltzer offered a clear framework for the way such a policy direction might allow additional stimulus after conventional operations had lowered nominal short-term interest rates to zero.

In the typical model, money and bonds become perfect substitutes at some low interest rate, perhaps zero, and we have a liquidity trap where monetary policy loses its power to add further stimulus by lowering interest rates on bonds. Meltzer suggested this result reflects the limits of the typical model more than the limits on monetary policy. He suggested that in a more realistic model incorporating multiple assets – for example, long-term as well as short-term government bonds, private as well as government debt, and equities as well as bonds and money – there are two ways in which the economy can escape from the apparent liquidity trap.

In an activist approach, monetary policy would increase the sum of the money supply and short-term government debt – the assets that have become perfect substitutes – by widening the scope of open market operations to include purchases of long-term government debt and perhaps private debt and foreign exchange. Such operations might lower interest rates on long-term and private securities and/or result in a depreciation of the currency, in all cases stimulating aggregate demand. A more passive approach would allow deflation to raise the real value of the sum of money balances and short-term debt.
This will be the outcome of deflation as long as the central bank does not let the nominal money supply decline as the price level falls. An increase in the real money supply would then result in increased purchases of a wide array of financial assets, including longer-term government bonds and private debt and perhaps even equities. The net result will be lower interest rates on long-term and private securities and higher values of equities that, in turn, will stimulate spending, over and above the stimulus that results from the wealth effect associated with increased real money balances.

This analysis raises an interesting set of questions about which reasonable people can disagree. Theory would seem to leave open the possibility that such wider operations might provide incremental stimulus, but I read the empirical evidence and historical experience as raising doubts about the effectiveness of such actions. Important questions relate to both the theoretical structure of asset demands and empirical evidence about portfolio behavior and asset markets.

The issue of whether the relative supply of short- and long-term bonds affects the term structure of interest rates is crucial. The most widely accepted theory of the term structure, called the pure expectations model, holds that long-term interest rates are a weighted average of current and expected future short-term rates. This approach leaves no room for relative supply effects and is consistent with the term-structure trap that I have associated with Keynes. There is, however, a competing theoretical model, often referred to as the market segmentation approach, which allows for the effect of relative supplies. I have never given much weight to the role of relative supply effects in affecting the term structure or exchange rates, given the failure of empirical studies to find much evidence of such effects.
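The implication of the pure expectations model for the zero-bound discussion can be seen in a few lines. The rate path below is an illustrative assumption only: short rates expected to stay at zero for three years and then return to 3% as the economy recovers.

    # Pure expectations model: the n-year rate is (approximately) the
    # average of expected short rates over the next n years.
    expected_short = [0.0] * 3 + [3.0] * 17   # an assumed 20-year path, percent
    long_rate_20y = sum(expected_short) / len(expected_short)
    print(long_rate_20y)                      # 2.55 -- positive despite zero short rates today

Two points follow: long rates remain positive even with the short rate pinned at zero, and, because only the expected path of short rates enters, there is no room in this model for relative supplies to matter – which is exactly what the market segmentation approach disputes.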
Of course, at the extreme, the Bank of Japan could set the price and hence the interest rate on long-term bonds if it was prepared to take all these assets onto its balance sheet. But such operations almost certainly blur the distinction between monetary and fiscal policies.

Another reason for skepticism is that the domestic channel through which monetary policy works in Japan operates very importantly through the banking system. The continuing banking sector problems suggest that this channel remains weak, if not inoperative. Specifically, if additional reserves were injected into the banking system, a larger share of them would likely be held as excess reserves rather than lent out. In this case, attention shifts to the effect of monetization on exchange rates and to the recommendation that the Bank of Japan raise the money supply by purchasing foreign currency, an operation sometimes referred to as unsterilized exchange rate intervention.

There is a case in which unsterilized intervention has a more powerful effect on exchange rates than sterilized intervention (where the central bank absorbs any reserves introduced as part of exchange rate intervention). But this incremental effect arises because unsterilized intervention is expected to lower the country’s interest rates, and the lower interest rates would, in turn, put downward pressure on the country’s exchange rate. If the interest rate channel does not operate because of a liquidity trap, this could raise questions about the effect of unsterilized foreign exchange intervention. Even in this context, however, there may be some positive effects of unsterilized intervention. This operation, like open market operations conducted in long-term securities, is a way of raising the sum of money and short-term government securities. As portfolios are rebalanced, the increase in the money supply may result in purchases of long-term government securities, private securities, and foreign-currency-denominated assets. This could affect a range of interest rates and the exchange rate, though once again there is a question about the degree to which relative supplies of assets have an important bearing on relative rates of return.

Moreover, such a strategy faces other obstacles. First, in Japan, foreign exchange operations are at the discretion of the Ministry of Finance. Implementing a monetization strategy in this way would therefore appear to shift the decision about the timing and magnitude of monetary policy from the newly independent central bank back to the Ministry of Finance. Second, pursuing a stimulus program focused on yen depreciation might exacerbate tensions related to the already wide current account imbalances in Japan and the United States, as well as possibly interfere with the recoveries under way among Japan’s Asian trading partners.

So I remain skeptical that there is much leverage in the monetization approach. Nevertheless, if the Japanese economy fails to respond to the policies now in place, one could argue for some experimentation in this direction, given the absence of other options for monetary policy. Finally, in cases of a nominal interest rate bound, fiscal policy could and should carry more of the stabilization burden, as has been the case in Japan recently.

III. Asset market bubbles and monetary policy

Let me move to the challenge of how monetary policy should respond to suspicions of asset market bubbles. An asset market bubble refers to an extended increase in the price of assets not justified by the fundamentals. Such movements might be associated with waves of optimism or pessimism that become self-perpetuating. There is not complete agreement as to the usefulness of the concept of asset bubbles. I do find it plausible that market prices might sometimes, and for some period, depart from values that are justified by fundamental forces. Over longer periods, markets will converge back to fundamental value. However, when large departures occur, there is potential for a sharp correction that, in turn, can have damaging consequences for the real economy.

There are two asset markets that are of special importance: the equity market and the market for real property – land and buildings. The fundamental value of a stock is the present value of the expected earnings stream of the firm in question, derived by applying a discount factor that accounts for both the interest rate on safe assets and a risk factor appropriate to the uncertainties about the expected future earnings stream. The fundamentals underpinning the price of equities are therefore the expected increase in earnings and the discount rate that transforms expected earnings into a price for the asset. Equity prices could rise above their fundamental value if investors hold unrealistic expectations about earnings growth, if they assume that the earnings stream is more stable than it will turn out to be, or if they otherwise believe there is less risk associated with holding the equities than turns out to be the case. In these cases, reality will at some point disappoint relative to expectations and force a reappraisal of the value of equities.
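A minimal discounted-earnings sketch shows how sensitive that fundamental value is to the assumed fundamentals; the horizon, growth rates, and premiums below are assumptions chosen only for illustration.

    # Present value of an expected earnings stream, discounted at the safe
    # rate plus an equity risk premium. All inputs are illustrative.
    def fundamental_value(earnings, growth, safe_rate, risk_premium, years=50):
        r = safe_rate + risk_premium
        return sum(earnings * (1 + growth) ** t / (1 + r) ** t
                   for t in range(1, years + 1))

    base = fundamental_value(5.0, growth=0.05, safe_rate=0.05, risk_premium=0.04)
    hot  = fundamental_value(5.0, growth=0.07, safe_rate=0.05, risk_premium=0.02)
    print(round(base), round(hot))            # about 111 versus 250

Raising assumed earnings growth by two percentage points and trimming the risk premium by the same amount more than doubles the “justified” price – which is why structural-change arguments can rationalize large run-ups, and why disappointed expectations can force large reappraisals.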
A similar process underlies the price of real property.

Monetary policy, as I emphasized earlier, focuses on price stability and damping fluctuations around full employment. Should it also focus on encouraging asset prices to return towards perceived fundamental value, if asset prices appear to depart significantly from policymakers’ perception of that fundamental value? That is another question tackled at the Jackson Hole conference. It is one motivated by at least two recent experiences.

First, the Japanese economy is often described today as suffering from a burst asset bubble. During the 1980s, the Japanese economy registered very strong growth, low inflation, and soaring equity and property prices. In 1989, following a tightening of monetary policy, there was a sharp collapse in asset prices: equity prices fell by 60% and urban land values by more than 75%. In addition to the direct effects of the decline in wealth on consumer spending, the collapse of asset prices had a devastating effect on the banking system. Real property dominates the collateral underlying many of the loans of most banking systems; a collapse in real property values therefore leaves most loans without adequate collateral support. In addition, in Japan, the banks hold considerable equities in their portfolios. Japanese banks therefore suffered a double blow in the collapse of equity and property prices. As a result, with the capital of the banking system severely depleted, banks had to restrict their lending, leading to a severe credit crunch that added to the forces depressing the Japanese economy.

The second experience hits closer to home. Many have viewed the surge in equity prices in the United States over the past four years as evidence of a bubble. The Economist magazine is a leading proponent of this view, and many others subscribe in varying degrees to this characterization. It is true that the rise in equity prices – averaging 25% to 30% a year over the last four years – is unprecedented and that current values challenge previous valuation standards. But one could argue that structural changes in the economy have raised the sustainable level and growth of earnings and lowered the volatility of earnings or otherwise reduced the perceived risk in equities. Such structural changes could, in principle, justify at least a substantial portion of the rise in equity prices. But the question at issue here is whether policymakers should substitute their judgment about fundamental value for the market’s assessment and use monetary policy to encourage a convergence back to their own estimate of fundamental value.

The paper by Bernanke and Gertler at the Jackson Hole conference addressed this question. They used a methodology that has proved valuable in studying a number of other questions related to the strategy of monetary policy. They first construct a small model of the US economy and then subject this model to a series of disturbances that reflect the economy’s historical experience. They observe the resulting variability of inflation and output relative to their respective targets. The base model includes a policy rule according to which short-term interest rates are adjusted in response to economic developments. Bernanke and Gertler examine whether an attempt by policymakers to return equity prices towards their estimate of fundamental value improves macroeconomic performance, judged in terms of inflation variability and output variability.
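The flavor of that methodology can be conveyed with a deliberately toy simulation – a schematic of the approach, emphatically not the Bernanke-Gertler model; every coefficient below is an arbitrary assumption.

    import random

    # Toy economy hit by random disturbances; compare inflation and output
    # variability under a rule that ignores an asset-price gap and one that
    # leans against it. Schematic only -- not the Bernanke-Gertler model.
    def simulate(react_to_assets, periods=10_000, seed=0):
        rng = random.Random(seed)
        y = pi = a = 0.0                  # output gap, inflation gap, asset-price gap
        var_y = var_pi = 0.0
        for _ in range(periods):
            r = 1.5 * pi + 0.5 * y + (0.5 * a if react_to_assets else 0.0)
            a = 0.8 * a + rng.gauss(0, 1)             # partly bubble-driven asset gap
            y = 0.6 * y - 0.4 * r + 0.2 * a + rng.gauss(0, 1)
            pi = 0.7 * pi + 0.3 * y
            var_y += y * y / periods
            var_pi += pi * pi / periods
        return var_y, var_pi

    print(simulate(react_to_assets=False))
    print(simulate(react_to_assets=True))

In a toy setting the ranking of the two rules depends entirely on the assumed coefficients, which is precisely why the answer turns on how well the underlying model captures the economy.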
Confidence in their conclusion is, of course, affected by how well one believes the model captures the performance of the economy. Nevertheless, their methodology is well designed and it is worth considering their conclusions. They find that policymakers cannot improve the outcomes by responding directly to suspected deviations of equity prices from fundamentals, but that a policy focused on achieving price stability and damping fluctuations around full employment will mitigate the adverse consequences of equity market bubbles. That is, a monetary policy focused on price stability and output stabilization will respond to the effects of higher equity prices on aggregate demand, real economic activity, and inflation. This will generally dampen movements in equity prices, while contributing to meeting the broader macroeconomic objectives of monetary policy. However, given the difficulty in distinguishing between changes in asset prices dominated by fundamental forces and those driven by non-fundamental forces, policymakers should not target asset prices or try to guide them to the policymakers’ estimate of fundamental value.

The discussion of the paper by Rudy Dornbusch and comments by Federal Reserve Chairman Alan Greenspan added an important additional theme related to monetary policy and equity prices. Dornbusch took note of the setting, pointing out that the two sides of the Teton Mountains are dramatically different. One side slopes downwards gradually and gracefully. The other side drops off quite precipitously. So it is with equity prices. On the way up, they typically move gradually. While they sometimes also move downward gradually, downward movements are occasionally steeper and more discontinuous than upward movements.

Monetary policymakers sometimes face additional problems in the case of such steep declines in asset values. Credit markets may become extremely illiquid and even fail to operate for a period. It is not simply that interest rates on private securities rise, but that virtually all buying and selling may temporarily cease. This can create extreme problems for those who rely on short-term financing and, in the extreme, the resulting financial distress can threaten the solvency of some financial institutions. In such situations, monetary policy typically intervenes to provide liquidity, until markets recover and begin to operate more normally. Because of this, it is sometimes alleged that monetary policy stands ready to intervene to protect market prices in a downturn. Chairman Greenspan commented that markets are asymmetric, not monetary policy. He emphasized that monetary policy does not operate with a target for equity prices when they are falling any more than it does when equity prices are rising. In both cases, monetary policy responds only indirectly to equity prices, by taking equity prices into account in the assessment of aggregate demand. But monetary policy has to respond quickly to the special circumstances that accompany a collapse of asset values, specifically the extreme illiquidity and seizing up of credit markets. This occurred both in 1987 and, more recently, in the fall of 1998.

IV. Globalization and monetary policy: choosing exchange rate and monetary policy regimes

The world economy has become increasingly globalized over the last couple of decades, measured both by the flow of trade among countries and especially by the flow of international capital.
An important challenge facing central banks around the world is how this globalization has affected their ability to pursue domestic objectives with monetary policy and, indeed, whether it is even possible to preserve an independent monetary policy.

The freedom to pursue an independent monetary policy will be determined, to an important degree, by the choice of exchange rate regime. A government that pegs its exchange rate to another country, for example, gives up its ability to pursue an independent monetary policy. Its interest rates must be set to support the fixed exchange rate and will generally move with the interest rate in the country to which it is pegged. Countries pegged to the dollar, in effect, are tied to the monetary policy pursued by the United States. Such regimes are particularly effective ways to make a transition from hyperinflation to the low inflation rate of the country to which the currency is pegged. For example, if the country has no history of an independent central bank successfully achieving low inflation, the country might be better off abandoning the attempt at independent monetary policy and buying into another country’s monetary policy and inflation outcomes. This is precisely the decision made by Argentina, and it has contributed to maintaining low inflation, following the transition from hyperinflation that had been achieved just prior to its decision to fix its exchange rate to the dollar.

It follows that if a country wants to have an independent monetary policy, it must choose a flexible exchange rate regime, and if it chooses a flexible exchange rate regime, it must complement it with a disciplined monetary policy. Many countries pursuing this course have opted for flexible inflation targeting.

But, under any exchange rate regime, small open economies in general and emerging market economies in particular are challenged by volatile international capital flows. The challenge under an adjustable peg – a regime in which the exchange rate is fixed at any point in time but can be adjusted over time – is particularly severe. If investors believe that a currency is overvalued, they will engage in transactions – such as purchasing assets denominated in other currencies or selling short the domestic currency – that will pay off if the currency is devalued. These very transactions will make it difficult for the country to sustain its current exchange rate. For a while, the country may sustain its current exchange rate by buying its currency with dollar reserves at the fixed exchange rate and raising its interest rates. But, depending on the size of capital flows, official reserves could be quickly depleted, forcing the country to abandon the peg altogether and float its currency. In addition, the higher interest rates used to defend the exchange rate may threaten a sharp decline in the economy and a collapse of the banking system. When this happens, currency values and equity prices often plunge below appropriate levels, with resulting adverse consequences for the real economy.

The conventional wisdom today is that small open economies face a choice of one of the extremes – either a flexible exchange rate regime complemented by a disciplined monetary policy, or a very fixed exchange rate regime, characterized by a currency board or by adopting some other country’s currency, as in dollarization.
A currency board is an arrangement whereby the domestic currency of a country is required to be fully backed by reserves held in some other country’s currency, such as dollars. Hong Kong and Argentina have currency boards. If global investors attack such a currency, the use of official reserves to support the currency depletes reserves and requires a corresponding decline in the supply of the domestic currency. This automatically pushes up domestic interest rates to support the currency. The value of the currency board is that it puts domestic monetary policy on automatic pilot and guarantees that policy will move aggressively to support the fixed exchange rate when it is under attack. The markets no longer have to worry about the willingness of the policy authorities to adjust interest rates aggressively enough to support the currency.

Dollarization – which I will use broadly to refer to the strategy of adopting some other country’s currency – takes the currency board one step further. Under a currency board, the threat remains that the government will abandon the arrangement and devalue the currency or let it float. Dollarization increases the commitment of a country to a fixed exchange rate. Under dollarization, a country uses dollars for its domestic currency. It therefore faces dollar interest rates, although these rates will not necessarily be the same as those prevailing in the United States. Once again it has given up independent monetary policy. The advantage of this regime is that it might reduce the risk premia that remain in domestic interest rates under a currency board, premia that reflect the risk that the currency board might be abandoned. Of course, a country could reverse dollarization as well, but the costs of such a move would be very great.

At the Jackson Hole conference, Eichengreen and Hausmann presented a discussion of the choice between flexible and very fixed exchange rates for small open economies. They suggested that a problem besetting many such economies is that, because of weak institutions and a failure to pursue sound policies, neither the government nor private citizens can borrow long-term or abroad in their domestic currency. The result is dangerous portfolio mismatches – long-term projects financed with short-term debt and/or domestic projects financed with foreign currency loans. In either case, the government and private citizens are subjected to the risks of unexpected changes in short-term interest rates and/or to the risks of a change in exchange rates. A currency board or dollarization arrangement, in such a case, might reduce the risks associated with such mismatches. As a result, the country might be able to reduce its risk premium, lower its vulnerability, and increase its access to long-term and foreign finance.

In my view, however, the underlying problem is often the mismatch between the speed with which an emerging market economy participates in the global economy and the speed with which its institutions and policies adapt to global norms. The best choice over time would appear to be to develop robust domestic institutions and pursue sound policies, including a disciplined monetary policy, and adopt a flexible exchange rate regime. An increased reliance on foreign direct investment relative to short-term portfolio capital might also be desirable. The real question is how to get there from where many emerging market economies find themselves today.

There are, I believe, many advantages to a flexible exchange rate regime.
It avoids the problem of choosing the right level at which to fix the exchange rate. It allows exchange rates to move in response to shocks or structural trends, alleviating the need for other aspects of the economy – such as domestic demand or the level of wages and prices – to carry the burden of adjustment. Floating exchange rates also serve as indicators of investor confidence, providing feedback to policymakers as to whether they are pursuing appropriate policies. Floating and perhaps volatile exchange rates also remind both borrowers and lenders of the risks inherent in international finance and may militate against the development of bubbles and excessive capital flows. Finally, where the monetary authority is sufficiently credible and disciplined, floating exchange rates allow for independent and perhaps countercyclical monetary policy.

And it is worth pointing out some of the downside risks associated with currency boards or dollarization. Either a currency board or dollarization requires a strong banking system, because under these regimes governments lose their ability to print money and act as a lender of last resort. Many developing countries fail to meet this prerequisite. In addition, dollarization would not completely eliminate risk premia, because debt repayment problems are certainly possible in fully dollarized economies. So an important issue is how much of the prevailing risk premia faced by small open emerging market economies is due to exchange rate risk and how much to other considerations. Finally, lower risk premia could have the perverse effect of alleviating pressure on governments to pursue structural reforms that would lead to a more lasting improvement in the economy’s performance.

On balance, I continue to lean towards flexible exchange rate regimes, but I now better appreciate that there could be circumstances favorable to very fixed exchange rate regimes, including as a transition to flexible exchange rates once the credibility of a country’s economic policies and institutions is sufficiently developed.

V. What I learned from the Jackson Hole Conference

I have presented today a short course that might be called the “Jackson Hole Seminar.” As any professor will tell you, the test of a seminar is what the students learn. But handing out a test would not be a pleasant way for a visitor to conclude his visit to your campus. So I will end with some comments on what I learned at Jackson Hole.

1. There is an emerging consensus towards flexible inflation targeting. Some central banks are more transparent about the dual objectives and some are more explicit about the inflation target, but there is broad agreement about what the targets should be. There is somewhat less agreement about how monetary policy should be conducted to achieve the targets, but some convergence here as well.

2. While there are some intriguing new ideas about price level or average inflation targets, the consensus based on practice and recent performance around the world is that a low but positive inflation target remains prudent. I refer to this target as price stability plus a cushion. The cushion mitigates the risk that monetary policy might lose its ability to provide further stimulus before it was able to adequately dampen the effect of downward shocks to the economy.

3. The conference offered a better understanding of how a monetization option might allow additional stimulus once monetary policy had pushed the nominal interest rate on short-term government debt to zero.
But it did not offer much confidence, to me at least, that monetization is an effective way out of the current predicament in Japan or that initiating an inflation target, once having encountered this problem, would be effective.

4. The conference provided some support to the conventional wisdom – at least the conventional wisdom inside the Federal Reserve – about how monetary policy should or should not respond to suspected asset market bubbles.

5. The conference also provided a nicely balanced assessment of the choice between the extreme solutions for exchange rate regimes – that is, between flexible and very fixed exchange rate regimes. While the discussion clarified the problems and choices, it still left me leaning towards flexible exchange rates.
Mr Greenspan examines the sources of financial risk and the challenges faced by risk managers

Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before a conference sponsored by the Office of the Comptroller of the Currency, held in Washington, D.C. on 14 October 1999.

* * *

One of the broad issues that you have been discussing today is the nature of financial risk. This evening I will offer my perspective on the fundamental sources of financial risk and the value added of banks and other financial intermediaries. Then, from that perspective, I will delve into some of the pitfalls inherent in risk-management models and the challenges they pose for risk managers.

Risk, to state the obvious, is inherent in all business and financial activity. Its evaluation is a key element in all estimates of wealth. We are uncertain that any particular non-financial asset will be productive. We’re also uncertain about the flow of returns that the asset might engender. In the face of these uncertainties, we endeavor to estimate the most likely long-term earnings path and the potential for actual results to deviate from that path, that is, the asset’s risk. History suggests that day-to-day movements in asset values primarily reflect asset-specific uncertainties, but, especially at the portfolio level, changes in values are also driven by perceptions of uncertainties relating to the economy as a whole and to asset values generally. These perceptions of broad uncertainties are embodied in the discount factors that convert the expectations of future earnings to current present values, or wealth.

In a market economy, all risks derive from the risks of holding real assets or, equivalently, unleveraged equity claims on those assets. All debt instruments (and, indeed, equities too) are essentially combinations of long and short positions in those real assets. The marvel of financial intermediation is that, although it cannot alter the underlying risk in holding direct claims on real assets, it can redistribute risks in a manner that alters behavior. The redistribution of risk induces more investment in real assets and hence engenders higher standards of living. This occurs because financial intermediation facilitates diversification of risk and its redistribution among people with different attitudes toward risk. Any means that shifts risk from those who choose to withdraw from it to those more willing to take it on permits increased investment without significantly raising the perceived degree of discomfort from risk that the population overall experiences. Indeed, all value added from new financial instruments derives from the service of reallocating risk in a manner that makes risk more tolerable. Insurance, of course, is the purest form of this service. All the new financial products that have been created in recent years, financial derivatives being in the forefront, contribute economic value by unbundling risks and reallocating them in a highly calibrated manner. The rising share of finance in the business output of the United States and other countries is a measure of the economic value added from its ability to enhance the process of wealth creation.

But while financial intermediation, through its impetus to diversification, can lower the risks of holding claims on real assets, it cannot alter the more deep-seated uncertainties inherent in the human evaluation process.
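That division – between risk that diversification can remove and risk that it cannot – has a simple quantitative core, sketched below with illustrative numbers. An equally weighted portfolio of N assets, each with volatility sigma and pairwise correlation rho, has variance sigma^2 * (rho + (1 - rho) / N): the idiosyncratic part shrinks as N grows, but the common part never does.

    # Portfolio volatility as diversification broadens. Illustrative numbers.
    sigma, rho = 0.30, 0.20
    for n in (1, 10, 100, 1000):
        var = sigma ** 2 * (rho + (1 - rho) / n)
        print(n, round(var ** 0.5, 3))   # 0.3, 0.159, 0.137, 0.134 -- floor near 0.134

However many claims are pooled, the floor set by the common component remains; that floor corresponds to the economy-wide uncertainty just described.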
There is little in our historical annals that suggests that human nature has changed much over the generations. But, as I have noted previously, while time preference may appear to be relatively stable over history, perceptions of risk and uncertainty, which couple with time preference to create discount factors, obviously vary widely, as does liquidity preference, itself a function of uncertainty. These uncertainties are an underlying source of risk that we too often have regarded as background noise and generally have not endeavored to capture in our risk models. Almost always this has been the right judgment. However, the decline in recent years in the equity premium – the margin by which the implied rate of discount on common stock exceeds the riskless rate of interest – should prompt careful consideration of the robustness of our portfolio risk-management models in the event this judgment proves wrong. The key question is whether the recent decline in equity premiums is permanent or temporary. If the decline is permanent, portfolio risk managers need not spend much time revisiting a history that is unlikely to repeat itself. But if it proves temporary, portfolio risk managers could find that they are underestimating the credit risk of individual loans based on the market value of assets and overestimating the benefits of portfolio diversification. There can be little doubt that the dramatic improvements in information technology in recent years have altered our approach to risk. Some analysts perceive that information technology has permanently lowered equity premiums and, hence, permanently raised the prices of the collateral that underlies all financial assets. The reason, of course, is that information is critical to the evaluation of risk. The less that is known about the current state of a market or a venture, the less the ability to project future outcomes and, hence, the more those potential outcomes will be discounted. The rise in the availability of real-time information has reduced the uncertainties and thereby lowered the variances that we employ to guide portfolio decisions. At least part of the observed fall in equity premiums in our economy and others over the past five years does not appear to be the result of ephemeral changes in perceptions. It is presumably the result of a permanent technology-driven increase in information availability, which by definition reduces uncertainty and therefore risk premiums. This decline is most evident in equity risk premiums. It is less clear in the corporate bond market, where relative supplies of corporate and Treasury bonds and other factors we cannot easily identify have outweighed the effects of more readily available information about borrowers. The marked increase over this decade in the projected slope of technology advance, of course, has also augmented expectations of earnings growth, as evidenced by the dramatic increase since 1995 in security analysts’ projections of long-term earnings. While it may be that the expectations of higher earnings embodied in equity values have had a spillover effect on discount factors, the latter remain essentially independent of the earnings expectations themselves. That equity premiums have generally declined during the past decade is not in dispute. What is at issue is how much of the decline reflects new, irreversible technologies, and what part is a consequence of a prolonged business expansion without a significant period of adjustment. 
The business expansion is, of course, reversible, whereas the technological advancements presumably are not. Some analysts have offered an entirely different interpretation of the drop in equity premiums. They assert that a long history of a rate of return on equity persistently exceeding the riskless rate of interest is bound to induce a learning-curve response that will eventually close the gap. According to this argument, much, possibly all, of the decline in equity premiums over the past five years reflects this learning response. It would be a mistake to dismiss such notions out of hand. We have learned to no longer cower at an eclipse of the sun or to run for cover at the sight of a newfangled automobile. But are we really observing in today’s low equity premiums a permanent move up the learning curve in response to decades of data? Or are other factors at play? Some analysts have suggested several problems with the learning curve argument. One is the persistence of an equity premium in the face of the history of “excess” equity returns. Is it possible that responses toward risk are more akin to claustrophobia than to a learning response? No matter how many times one emerges unscathed from a claustrophobic experience, the sensitivity remains. In that case, there is no learning experience. Whichever case applies, what is certain is that the question of the permanence of the decline in equity premiums is of critical importance to risk managers. They cannot be agnostic on this question because any abrupt rise in equity premiums must inevitably produce declines in the values of most private financial obligations. Thus, however clearly they may be able to evaluate asset-specific risk, they must be careful not to overlook the possibilities of macro risk that could undermine the value of even a seemingly well-diversified portfolio. I have called attention to this risk-management challenge in a different context when discussing the roots of the international financial crises of the past two and a half years. My focus has been on the perils of risk management when periodic crises – read sharply rising risk premiums – undermine risk-management structures that fail to address them. During a financial crisis, risk aversion rises dramatically, and deliberate trading strategies are replaced by rising fear-induced disengagement. Yield spreads on relatively risky assets widen dramatically. In the more extreme manifestation, the inability to differentiate among degrees of risk drives trading strategies to ever-more-liquid instruments that permit investors to immediately reverse decisions at minimum cost should that be required. As a consequence, even among riskless assets, such as US Treasury securities, liquidity premiums rise sharply as investors seek the heavily traded “on-the-run” issues – a behavior that was so evident last fall. As I have indicated on previous occasions, history tells us that sharp reversals in confidence occur abruptly, most often with little advance notice. These reversals can be self-reinforcing processes that can compress sizable adjustments into a very short period. Panic reactions in the market are characterized by dramatic shifts in behavior that are intended to minimize short-term losses. Claims on far-distant future values are discounted to insignificance. What is so intriguing, as I noted earlier, is that this type of behavior has characterized human interaction with little appreciable change over the generations. 
Whether Dutch tulip bulbs or Russian equities, the market price patterns remain much the same. We can readily describe this process, but, to date, economists have been unable to anticipate sharp reversals in confidence. Collapsing confidence is generally described as a bursting bubble, an event incontrovertibly evident only in retrospect. To anticipate a bubble about to burst requires the forecast of a plunge in the prices of assets previously set by the judgments of millions of investors, many of whom are highly knowledgeable about the prospects for the specific investments that make up our broad price indexes of stocks and other assets.

Nevertheless, if episodic recurrences of ruptured confidence are integral to the way our economy and our financial markets work now and in the future, the implications for risk measurement and risk management are significant. Probability distributions estimated largely, or exclusively, over cycles that do not include periods of panic will underestimate the likelihood of extreme price movements because they fail to capture a secondary peak at the extreme negative tail that reflects the probability of occurrence of a panic. Furthermore, joint distributions estimated over periods that do not include panics will underestimate correlations between asset returns during panics. Under these circumstances, fear and disengagement on the part of investors holding net long positions often lead to simultaneous declines in the values of private obligations, as investors no longer realistically differentiate among degrees of risk and liquidity, and to increases in the values of riskless government securities. Consequently, the benefits of portfolio diversification will tend to be overestimated when the rare panic periods are not taken into account.
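The sampling problem just described can be made concrete with a small simulation – a sketch under stated assumptions, not a calibrated model. Returns on two assets come from a calm regime most of the time and, about 2% of the time, from a panic regime in which a common fear-driven selloff hits both.

    import random

    rng = random.Random(1)

    def draw(panic):
        if panic:
            common = rng.gauss(-8, 4)              # shared selloff component
            return common + rng.gauss(0, 1), common + rng.gauss(0, 1)
        return rng.gauss(0.05, 1), rng.gauss(0.05, 1)  # nearly independent in calm times

    sample = []
    for _ in range(100_000):
        panic = rng.random() < 0.02
        sample.append((panic,) + draw(panic))

    def corr(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
        vx = sum((x - mx) ** 2 for x in xs) / n
        vy = sum((y - my) ** 2 for y in ys) / n
        return cov / (vx * vy) ** 0.5

    for label, data in (("full sample", sample),
                        ("panic-free sample", [s for s in sample if not s[0]])):
        xs = [s[1] for s in data]
        ys = [s[2] for s in data]
        tail = sum(x < -5 for x in xs) / len(xs)
        print(label, "corr:", round(corr(xs, ys), 2), "P(5-sigma loss):", round(tail, 4))

A model fitted to the panic-free sample shows near-zero correlation and essentially no five-sigma losses; the full sample shows substantial correlation and a visible negative tail – understating, respectively, exactly the correlations and extreme movements described above.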
The uncertainties inherent in valuations of assets and the potential for abrupt changes in perceptions of those uncertainties clearly must be adjudged by risk managers at banks and other financial intermediaries. At a minimum, risk managers need to stress test the assumptions underlying their models and set aside somewhat higher contingency resources – reserves or capital – to cover the losses that will inevitably emerge from time to time when investors suffer a loss of confidence. These reserves will appear almost all the time to be a suboptimal use of capital. So do fire insurance premiums.

More important, boards of directors, senior managers, and supervisory authorities need to balance emphasis on risk models that essentially have only dimly perceived sampling characteristics with emphasis on the skills, experience and judgment of the people who have to apply those models. Being able to judge which structural model best describes the forces driving asset pricing in any particular period is itself priceless. To paraphrase my former colleague Jerry Corrigan, the advent of sophisticated risk models has not made people with grey hair, or none, wholly obsolete.
Mr Greenspan asks whether efficient financial markets mitigate financial crises

Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the 1999 Financial Markets Conference of the Federal Reserve Bank of Atlanta, Sea Island, Georgia on 19 October 1999.

* * *

I am happy to address this conference, now in its eighth year, and endorse Atlanta Fed President Jack Guynn’s choice of topic. Many of us, with the benefit of hindsight, have been endeavoring for nearly two years to distil the critical lessons from the global crises of 1997 and 1998. Your contributions to this analysis are timely and useful. Knowing that you have touched on a number of topics over the last few days, I wanted to focus on some of the issues I raised at the most recent meetings of the IMF and World Bank; in particular, why the financial turmoil engendered by disruption in Asia resulted in a crisis longer and deeper than we expected in its early days. In a sense, I am turning the question posed by this conference – do efficient financial markets contribute to financial crises? – on its head by asking whether efficient financial markets mitigate financial crises.

To answer the question, we need to look at the financial market situation of just over a year ago. Following the Russian default of August 1998, public capital markets in the United States virtually seized up. For a time, not even investment-grade bond issuers could find reasonable takers. While Federal Reserve easing shortly thereafter doubtless was a factor, it is not credible that this move fully explained the dramatic restoration of most, though not all, markets in a matter of weeks. The problems in our markets appeared too deep-seated to be readily unwound solely by a cumulative 75 basis point ease in overnight rates. Arguably, at least as important was the existence of backup financial institutions, especially commercial banks, that replaced the intermediation function of the public capital markets. As public debt issuance fell, commercial bank lending accelerated, effectively filling in some of the funding gap. Even though bankers also moved significantly toward risk aversion, previously committed lines of credit, in conjunction with Federal Reserve ease, were an adequate backstop to business financing, and the impact on the real economy of the capital market turmoil was blunted. Firms were able to sustain production, and business and consumer confidence was not threatened. A vicious circle – in which the initial disruption leads to losses and then to further erosion in the financial sector – never got established.

Capital market alternatives

What we perceived in the United States in 1998 may reflect an important general principle: multiple alternatives to transform an economy’s savings into capital investment act as backup facilities should the primary form of intermediation fail. In 1998 in the United States, banking replaced the capital markets. Far more often it has been the other way around, as it was most recently in the United States a decade ago. When American banks stopped lending in 1990, as a consequence of a collapse in the value of real estate collateral, the capital markets were able to substitute for the loss of bank financial intermediation. Interestingly, the then recently developed mortgage-backed securities market kept residential mortgage credit flowing, which in prior years would have contracted sharply.
Arguably, without the capital market backing, the mild recession of 1991 could have been far more severe. Our mild recession in 1991 offers a stark contrast with the long-lasting problems of Japan, whose financial system is an example of predominantly bank-based financial intermediation. The keiretsu conglomerate system, as you know, centres on a “main bank,” leaving corporations especially dependent on banks for credit. Thus, one consequence of Japan’s banking crisis has been a protracted credit crunch. Some Japanese corporations did go to the markets to pick up the slack. Domestic corporate bonds outstanding have more than doubled over the decade while total bank loans have been almost flat. Nonetheless, banks are such a dominant source of funding in Japan that this increase in nonbank lending has not been sufficient to avert a credit crunch. The Japanese government is injecting funds into the banking system in order to recapitalize it. While it has made some important efforts, it has yet to make significant progress in diversifying the financial system. This could be a key element, although not the only one, in promoting long-term recovery. Japan’s banking crisis is also ultimately likely to be much more expensive to resolve than the American crisis, again providing prima facie evidence that financial diversity helps limit the effect of economic shocks.

This leads one to wonder how severe East Asia’s problems would have been during the past eighteen months had those economies not relied so heavily on banks as their means of financial intermediation. One can readily understand that the purchase of unhedged short-term dollar liabilities to be invested in Thai baht domestic loans would at some point trigger a halt in lending by Thailand’s banks if the dollar exchange rate did not hold. But why did the economy need to collapse when lending did? Had a functioning capital market existed, along with all the necessary financial infrastructure, the outcome might well have been far more benign. Before the crisis broke, there was little reason to question the three decades of phenomenally solid East Asian economic growth, largely financed through the banking system. The rapidly expanding economies and bank credit growth kept the ratio of nonperforming loans to total bank assets low. The failure to have backup forms of intermediation was of little consequence. The lack of a spare tyre is of no concern if you do not get a flat. East Asia had no spare tyres.

Managing bank crises

Banks, being highly leveraged institutions, have, throughout their history, periodically fallen into crisis. The classic problem of bank risk management is to achieve an always-elusive degree of leverage that creates an adequate return on equity without threatening default. The success rate has never approached 100 percent, except where banks are credibly guaranteed, usually by their governments, in the currency of their liabilities. But even that exception is by no means ironclad, especially when that currency is foreign. One can wonder whether in the United States of the nineteenth century, when banks were also virtually the sole intermediaries, numerous banking crises would have been as disabling if alternative means of intermediation were available. In dire circumstances, modern central banks have provided liquidity, but fear is not always assuaged by cash. Even with increased liquidity, banks do not lend in unstable periods.
The Japanese banking system today is an example: the Bank of Japan has created massive liquidity, yet bank lending has responded little. But unlike the United States a decade ago, alternative sources of finance are not yet readily available.

The case of Sweden’s banking crisis in the early 1990s, in contrast to America’s savings and loan crisis of the 1980s and Japan’s current banking crisis, illustrates another factor that often comes into play with banking sector problems: speedy resolution is good, whereas delay can significantly increase the fiscal and economic costs of a crisis. Resolving a banking-sector crisis often involves government outlays because of implicit or explicit government safety net guarantees for banks. Accordingly, the political difficulty in raising taxpayer funds has often meant delayed resolution. Delay, of course, can add to the fiscal costs and prolong a credit crunch.

Experience tells us that alternatives within an economy for the process of financial intermediation can protect that economy when one of those financial sectors undergoes a shock. Australia serves as an interesting test case in the most recent Asian financial turmoil. Despite its close trade and financial ties to Asia, the Australian economy exhibited few signs of contagion from contiguous economies, arguably because Australia already had well-developed capital markets as well as a sturdy banking system.

But going further, it is plausible that the dividends of financial diversity extend to more normal times as well. The existence of alternatives may well insulate all aspects of a financial system from breakdown. Diverse capital markets, aside from acting as backup to the credit process in times of stress, compete with a banking system to lower financing costs for all borrowers in more normal circumstances. Over the decades, capital markets and banking systems have interacted to create, develop and promote new instruments that improved the efficiency of capital creation and risk bearing in our economies. Products for the most part have arisen within the banking system, where they evolved from being specialized instruments for one borrower to having more standardized characteristics. At the point that standardization became sufficient, the product migrated to open capital markets, where trading expanded to a wider class of borrowers, tapping the savings of larger groups. Money market mutual funds, futures contracts, junk bonds, and asset-backed securities are all examples of this process at work. Once capital markets and traded instruments came into existence, they offered banks new options for hedging their idiosyncratic risks and shifted their business from holding to originating loans. Bank trading, in turn, helped these markets to grow. The technology-driven innovations of recent years have facilitated the expansion of this process to a global scale. Positions taken by international investors within one country are now being hedged in the capital markets of another: so-called proxy hedging.

Building financial infrastructure

But developments of the past two years have provided abundant evidence that where a domestic financial system is not sufficiently robust, the consequences for a real economy of participating in this new, complex global system can be most unwelcome. It is not surprising that banking systems emerge as the first financial intermediary in market economies as economic integration intensifies.
Banks can marshal scarce information about the creditworthiness of borrowers to guide decisions about the allocation of capital. The addition of capital market alternatives is possible only if scarce real resources are devoted to building a financial infrastructure – a laborious process whose payoff is often experienced only decades later. The process is difficult to initiate, especially in emerging economies that are struggling to edge above the poverty level, because of the perceived need to concentrate on high short-term rates of return to capital rather than to accept more moderate returns stretched over a longer horizon. We must continually remind ourselves that a financial infrastructure is composed of a broad set of institutions whose functioning, like all else in a society, must be consistent with the underlying value system. On the surface, financial infrastructure appears to be a strictly technical concern. It includes accounting standards that accurately portray the condition of the firm, legal systems that reliably provide for the protection of property and the enforcement of contracts, and bankruptcy provisions that lend assurance in advance as to how claims will be resolved in the inevitable event that some business decisions prove to be mistakes. Such an infrastructure promotes transparency within enterprises and allows corporate governance procedures that facilitate the trading of claims on businesses using standardized instruments rather than idiosyncratic bank loans. But the development of such institutions almost invariably is moulded by the culture of a society. Arguably the notion of property rights in today’s Russia is subliminally biased by a Soviet education that inculcated a highly negative view of individual property ownership. The antipathy to the “loss of face” in Asia makes it difficult to institute, for example, the bankruptcy procedures of Western nations, and even Western nations differ among themselves owing to deep-seated views of creditor-debtor relationships. Corporate governance that defines the distribution of power invariably reflects the most profoundly held societal views about the appropriate interaction of parties in business transactions. It is thus not a simple matter to append a capital markets infrastructure to an economy developed without it. Accordingly, instituting convergence across countries of domestic financial infrastructures or even of the components tied to international transactions is a very difficult task. Indeed, weaknesses in financial infrastructure made Asian banking systems more vulnerable before the crisis and have impeded resolution of the crisis subsequently. Lack of transparency coupled with an implicit government guarantee for banks encouraged investors to lend too much to banks too cheaply, with the consequence that capital was not allocated efficiently. Poor bankruptcy laws and procedures have made recovery on nonperforming bank loans a long and costly process. Moreover, the lack of transparency and of a legal infrastructure for enforcing contracts and collecting debts is a prime cause of the dearth of financial intermediation in Russia at this time. Nonetheless, the competitive pressures toward convergence will be a formidable force in the future if, as I suspect, additional forms of financial intermediation are seen as benefiting an economy. Moreover, a broader financial infrastructure will likely also strengthen the environment for the banking system and enhance its performance.
A recent study by Ross Levine and Sara Zervos suggests that financial market development improves economic performance, over and above the benefits offered by banking sector development alone. The results are consistent with the idea that financial markets and banks provide useful, but different, bundles of financial services and that utilizing both will almost surely result in a more robust and more efficient process of capital allocation. It is no coincidence that the lack of adequate accounting practices, bankruptcy provisions, and corporate governance has been mentioned as an element in several of the recent crises that so disrupted some emerging-market countries. Had these been present, along with the capital markets they would have supported, the consequences of the initial shocks of early 1997 might well have been quite different. It is noteworthy that the financial systems of most continental European countries escaped much of the turmoil of the past two years. And looking back over recent decades, we find fewer examples in continental Europe of banking crises sparked by real estate booms and busts or episodes of credit crunch of the sort I have mentioned in the United States and Japan. Until recently, the financial sectors of continental Europe were dominated by universal banks, and capital markets are still less well developed there than in the United States or the United Kingdom. The experiences of these universal banking systems may suggest that it is possible for some bank-based systems, when adequately supervised and grounded in a strong legal and regulatory framework, to function robustly. But these banking systems have also had substantial participation of publicly owned banks. Such institutions rarely exhibit the dynamism and innovation that many private banks have employed for their own and their economies’ prosperity. Government participation often distorts the allocation of capital to its most productive uses and undermines the reliability of price signals. But at times when market adjustment processes might have proved inadequate to prevent a banking crisis, such a government presence in the banking system can provide implicit guarantees of resources to keep credit flowing, even if its direction is suboptimal. In Germany, for example, publicly controlled banking groups account for nearly 40 percent of the assets of all banks taken together. Elsewhere in Europe, the shares are smaller but still sizable. In short, there is some evidence to suggest that insurance against destabilizing credit crises has been purchased with a less efficient utilization of capital. It is perhaps noteworthy that this realization has helped engender a downsizing of public ownership of commercial banks in Europe, coupled with rapid development of heretofore modest capital markets, changes which appear to be moving continental Europe’s financial system closer to the structure evident in Britain and the United States. Continental European countries may gain an additional benefit from the increased development of their capital markets. With increased concentration of national banking systems, which will likely be followed by increased concentration of Europe-wide banking, comes the risk of an unusually large impact should the health of a megabank become impaired, causing the bank to curtail its lending. Having well-developed capital markets would likely help to mitigate these effects, as more firms would have alternative sources of funds.
Conclusion
Improving domestic banking systems in emerging markets will help to limit the toll of the next financial disturbance. But if, as I presume, diversity within the financial sector provides insurance against a financial problem turning into economy-wide distress, then steps to foster the development of capital markets in those economies should also have an especial urgency. Moreover, the difficult groundwork for building the necessary financial infrastructure - improved accounting standards, bankruptcy procedures, legal frameworks, and disclosure - will pay dividends of its own. The rapidly developing international financial system has clearly intensified competitive forces that have enhanced standards of living throughout most of the world. It is important that we develop domestic financial structures that facilitate and protect our international financial and trading systems, a process that will require much energy and commitment in the years ahead.
|
board of governors of the federal reserve system
| 1999 | 10 |
Remarks by Mr Roger W Ferguson Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, before The Bond Market Association, New York, on 28 October 1999.
|
Mr Ferguson looks at financial market lessons for bankers and bank supervisors
Remarks by Mr Roger W Ferguson Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, before The Bond Market Association, New York, on 28 October 1999.
* * *
I am delighted to be with you today to offer some remarks about financial markets and about how financial innovation and business practices are affecting the supervision and regulation of banks. As a result of the “opportunities” many of you and your colleagues have provided, we have learned much in recent years about risk management practices and about market dynamics during periods of stress. Today I would like to discuss three elements of risk management in banks and, more broadly, in financial services. These topics might be of interest to you because, I believe, many of you play an important role in the risk measurement, management and mitigation activities in your firms. In addition, as barriers between financial firms dissolve, either because of market action or, as now seems likely, legislative mandate, we should all learn the risk management techniques that are current in other segments of the financial services market. Bankers can and will learn from securities dealers and traders and vice versa. The topics that I would like to cover are lessons from last year’s turmoil, approaches banks take in measuring market and credit risk, and proposed changes in regulatory oversight.
Lessons from last year’s turmoil
I would like to begin this afternoon by reviewing some of the central findings of a study issued this month by the Bank for International Settlements dealing with market events in the autumn of last year and then turn to bank supervisory matters. These findings offer important lessons for all of us who are interested in maintaining efficient financial markets that are undisturbed by systemic risks. I would note that the full report, entitled A Review of Financial Market Events in Autumn 1998, is available on the BIS website, www.bis.org. A central point in the paper is that banks and many other market participants are leveraged institutions. As a consequence, they are vulnerable when things go wrong. And so are their creditors, and then their creditors, too. This use of leverage allows financial institutions to employ capital in the most efficient and effective ways so as to provide maximum benefits to our society. When it comes to banks in particular, the key question is to what degree they should be leveraged, and that, in turn, depends largely on how they manage risk. Risk management practices used by both banking and non-bank organizations have improved significantly in recent years. Nevertheless, some of these new, innovative techniques, or at least their application in many firms, were an element of some of the problems we saw last year. In particular, “relative value arbitrage” techniques - in which approximately offsetting positions are taken in similar, but not identical, financial instruments - played an important role. Had the instruments been identical and simply traded in different markets, the technique would have been one of classic arbitrage and virtually risk-free. In fact, these positions were not risk-free. They were taken - and taken on an increasingly large scale - because risk managers were confident that they could measure risks to their satisfaction using improved techniques and historical data sources.
They were taken with the view that different prices for similar instruments would eventually converge, providing the holder with a profit. For a long period of time, the practice worked remarkably well. Another important factor was “proxy hedging”, in which traders took positions in larger, more liquid markets to offset exposures in more thinly traded markets. Hedging Russian securities with Hungarian or even Brazilian debt was an example. This risk management practice enabled traders to conduct more transactions than otherwise possible, but by its nature, it also tightened links across markets and altered price dynamics. The consequences, of course, are widely known. On the heels of earlier problems in Thailand, Indonesia, and other Asian countries, Russia’s default in August of last year caused investors worldwide to reassess risks and their views about conditions in emerging markets. A so-called “flight to quality” ensued, leaving still more turmoil in its wake. What can we learn, then, from this experience in terms of risk management practices and in supervising and regulating banks? For one, the world’s a dangerous place. That’s hardly news. In terms of financial markets, though, the experience illustrated quite vividly how closely linked world markets are today and the types of issues market participants and policymakers need to consider. Problems in Russia left their imprint on countries seemingly far removed, including Brazil. They also brought significant changes at a highly regarded US firm that was managed, in part, by leading financial market theorists and practitioners. It was a humbling and enlightening experience for us all. It should cause all of us to reassess our practices and our views about the underlying nature of market risks. As the BIS report makes clear, there are also more detailed lessons to learn. The report discusses nine lessons; I will pick three: • First, the inadequate assessment of counterparty risk, a task fundamental to lending and investment decisions, was in many ways at the core of the problem. Key market participants were allowed to grow through greater leverage and alter market terms in crucial ways, largely unchecked by traditional disciplines. Commercial banks, for whom judging credit risk is their life blood, were as guilty as any other institutions. Traditional practices of creditors of covering their exposures by requiring collateral that was marked-to-market proved insufficient as market values fell, creating a circular and expanding effect. • Second, market participants shared an insufficient recognition of the role of market liquidity in risk management. An important point here is the link between credit and market risk, and the fact that market prices can change sharply when key market participants pull out. In the proverbial “race for the door,” nearly everyone gets trampled. • The last point I will note relates to the over-reliance by practitioners on quantitative tools. Sophisticated measurement techniques can help greatly in providing insights about the dimensions of risk and sources of possible problems. But, like chains, models are only as strong as their weakest links. Every model has assumptions that must be tested, and its limitations must be understood. During periods of market stress nearly “all bets are off”. Business practices change, otherwise stable and expected correlations in market rates and prices disappear, and sometimes panic ensues. 
Well-thought-out contingency plans and scenario analysis tailored to specific strategies and portfolios are necessary to prepare for these events and to evaluate an institution’s risks. That point was brought home last year.
Measuring market and credit risk in banks
Fortunately, progress is being made as banking organizations - typically the largest US and foreign institutions - find better ways to quantify their risks. With market risk - the “easy” one - the Federal Reserve and other regulators built on industry practices for measuring a bank’s “value at risk” when implementing a new regulatory capital standard for the banking system last year. Basing capital requirements on a bank’s internal calculations of its largest expected daily trading loss at a 99% confidence level was an important step, we thought. It produced a standard far more sensitive to changing levels of risk than was the earlier approach. It provided a reasonably consistent standard among banks and also was compatible with current management practice of the world’s more progressive banks. Last year’s events have not changed our view about the merits of this approach. In creating the standard, though, we tried to recognize the measure’s limitations and to incorporate sufficient buffers. Everyone recognized the possibility of large, statistically improbable losses, and that the measures commonly used underestimated the likelihood of those events. (We just didn’t realize that such extreme outcomes would occur so soon.) So we required an assumed 10-day holding period, rather than the conventional single day, in order to account for illiquid markets, and we multiplied the capital that would result from that adjustment by three. We also added a charge for “specific risk” to address issuer defaults and other matters. And finally, we required a management process that included crucial checks and balances and further work by banks toward stress testing, including testing involving scenario analysis. These were the “qualitative” aspects of the standard. Results of stress tests, for example, were to be considered subjectively by management in evaluating a bank’s market risks and overall capital adequacy. At the time, many of these elements were criticized as excessive, producing much too large capital requirements. The jury on that point may still be out, but at least the standard performed well last year. None of the US and foreign banks subject to this internal models approach incurred losses last year exceeding their capital requirements for market risk, although a few came relatively close. On the other hand, some banks had trading losses that occasionally exceeded their daily value-at-risk calculations during the volatile fourth quarter. My point is not that we were so smart in constructing the standard, but rather that we all still have much to learn. Risk measurement practices are advancing, and they need to.
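To make the arithmetic of that market-risk standard concrete, here is a minimal sketch with invented numbers - an illustration only, not the regulatory methodology in full (the actual rule, for instance, also averages VaR over the prior 60 days, permits direct 10-day estimation rather than square-root-of-time scaling, and layers on the specific-risk charge described above):

```python
import numpy as np

def market_risk_capital(daily_pnl, confidence=0.99, horizon_days=10, multiplier=3.0):
    """Basel-style market-risk capital sketch: multiplier x 10-day, 99% VaR.

    daily_pnl: history of daily trading profit and loss.
    The 10-day figure is approximated by scaling 1-day VaR by sqrt(10).
    """
    # One-day VaR: the loss at the (1 - confidence) percentile of the P&L history.
    one_day_var = -np.percentile(daily_pnl, 100 * (1 - confidence))
    ten_day_var = one_day_var * np.sqrt(horizon_days)  # square-root-of-time scaling
    return multiplier * ten_day_var

# Illustrative history: 250 trading days of P&L, in millions of dollars.
rng = np.random.default_rng(0)
pnl = rng.normal(loc=0.5, scale=5.0, size=250)

print(f"1-day 99% VaR:  {-np.percentile(pnl, 1):.1f}")
print(f"Capital charge: {market_risk_capital(pnl):.1f}")
```

The tripling and the 10-day horizon are precisely the buffers described above: together they push the required cushion well beyond the loss a bank would expect even on its worst day in a hundred.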
With credit risk, we are all feeling our way, again with the assistance of many large banks. Supervisors report that these institutions are making progress in measuring credit risk and are devoting increased attention and resources to the task. In my view, continued progress in this area is fundamentally important on many fronts. Continually declining costs in collecting, storing, and analyzing historical loss data; innovative ways to identify default risks, including the use of equity prices; and greater efforts by banks to build finer risk differentiation into their internal credit rating processes have been of great help. As a result, banks are developing better tools to price credit risk, and they are providing clearer, more accurate signals and incentives to personnel engaged in managing and controlling the risk. Through the Basel Committee on Banking Supervision, the Federal Reserve and other US and foreign bank supervisory agencies are working actively to design a more accurate, risk-sensitive capital standard for credit risk than the one we have now. Full credit risk modeling seems currently beyond our reach, since industry practices have not sufficiently evolved. The Basel Committee expects, though, to propose next year an approach built on the internal credit risk ratings of banks. Such a new standard would be a major step for bank supervision and regulation and will also have major implications for banks around the world. It is also a necessary step, we believe, if we are to keep pace with market practices and address developments that undermine current standards. Let me emphasize that the new credit risk approaches being contemplated will be applicable only to the larger, more sophisticated and complicated organizations. The vast majority of banks need not have their capital requirements modified in significant ways as we move away from a one-size-fits-all structure. In order to spur industry efforts in measuring risk, the Federal Reserve this past summer issued a new supervisory policy directing examiners to review the internal credit risk rating systems of large banks. That statement emphasized the need for banking organizations to ensure their capital was not only adequate in meeting regulatory standards but also sufficient to support all underlying risks. We issued the guidance recognizing the need to make clear progress in developing new capital standards and also with the view that the industry has important steps to take. Our earlier discussions with major institutions about their own processes for judging their capital adequacy supported that view. Too often they rely on the regulatory measure itself and on the corresponding calculations for their peers. The role of internal measures of economic risks in evaluating the level of firm-wide capital seemed generally weak and unclear. The need for a stronger connection between economic risks and capital is particularly great at institutions actively involved in complex securitizations and in other complex transfers of risk. We do not expect immediate results for most organizations, but we want to see clear and steady progress made by them.
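The internal ratings at the heart of such approaches feed, at bottom, a simple decomposition of credit risk. The sketch below is a textbook illustration with invented grades and parameters - not the Basel Committee’s proposal - pricing the expected loss on a facility as the product of a probability of default mapped from the internal grade, a loss given default, and the exposure at default:

```python
# Illustrative internal-ratings arithmetic: EL = PD x LGD x EAD.
# The grades and parameter values are hypothetical, for exposition only.
PD_BY_GRADE = {  # one-year probability of default, by internal grade
    "1": 0.0003, "2": 0.0010, "3": 0.0040,
    "4": 0.0150, "5": 0.0500, "6": 0.1500,
}

def expected_loss(grade: str, exposure: float, loss_given_default: float) -> float:
    """Expected one-year credit loss on a single facility."""
    return PD_BY_GRADE[grade] * loss_given_default * exposure

# A $10 million loan to a grade-4 borrower with 45% expected loss severity:
el = expected_loss("4", exposure=10_000_000, loss_given_default=0.45)
print(f"Expected loss: ${el:,.0f} ({el / 10_000_000:.2%} of exposure)")
```

On these assumptions the expected loss is $67,500, or about 0.68 percent of the exposure - a floor on the spread the bank should charge before covering funding costs, operating costs, and a margin for unexpected loss, which is what capital is held against.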
With the varying and somewhat subjective nature of internal measures, the matter of consistency among banks becomes important, both to banks and their supervisors. Additional public disclosures by banks and market discipline can help in that respect. Bank Supervision. In supervising banks, US regulators have recognized the need for an ongoing, more risk-focused approach, particularly for large, complex and internationally active banks. We constantly need to stay abreast of the nature of their activities and of their management and control processes. For these institutions, point-in-time examinations no longer suffice, and they have not sufficed for some time. We need assurance that these institutions will handle routine and non-routine transactions properly long after examiners leave the bank. We also need to tailor our on-site reviews to the circumstances and activities at each institution, so that our time is well spent understanding the bank’s management process and identifying weaknesses in key systems and controls. Nevertheless, the process still entails a certain amount of transaction testing. To accommodate this process, the Federal Reserve has established a separate supervisory program for large, complex banking organizations, or LCBOs. We believe these institutions require more specialized, ongoing oversight because of the size and dynamic nature of their activities. The program is more, though, than simply enhanced supervision of individual institutions. It involves a broader understanding of the potential systemic risk represented by this group of institutions. Currently, there are about 30 institutions in the group, although the figure can change. They are typically both major competitors and counterparties of one another and, combined, account for a substantial share of the systemic risk inherent in the US banking system. Management of this process revolves around a supervisory officer, designated as a “Central Point of Contact”, and a team of experienced staff members with skills suited to the business activities and risk profile of each institution. In large part, they will focus on internal management information systems and procedures for identifying and controlling risk. They will need to understand the risk management process as each institution implements it - by major business line, by type of risk, and so forth - in reaching overall judgments about corporate-wide risks. We believe this approach will best help supervisors keep abreast of risks and events and that it will also help us identify and strengthen weak areas within banks. The principal risk in banking relates, of course, to credit risk arising from lending. For most of this decade loan portfolios and bank earnings have been strong, helped largely by persistently strong economic growth. That performance has strengthened the industry’s financial statements and substantially improved its image with investors. As time has passed, however, it also may have allowed banks to let underwriting standards slip in the face of competitive pressures and the view that times will remain good. We know from history that they won’t. Indeed, recent industry figures suggest the condition of loan portfolios may be declining as delinquencies build from admittedly low levels. Through supervisory actions and guidance, we try to maintain prudent standards throughout the business cycle. Market Discipline. Market discipline has, in my view, two key purposes. 
The first is to link banks’ funding costs - both debt and equity - more closely to their risk-taking. This linkage has been weakened, to varying degrees, by the safety net. Of course, a significant and growing proportion of the liabilities of large banks is in an uninsured or not fully insured form, so that linkage can be re-established. The cost of these funds, as well as the banks’ cost of equity capital, would clearly be affected by more disclosure of the risks in their portfolios. While banks already disclose considerable information, the balance between quantity and quality can be improved. Doing so should reduce the need for supervisors to intrude and should also affect a bank’s willingness to take risks, as its funding costs change. The second purpose of market discipline is to provide a supplementary source of information to the examination process. During my service on the Board, I have been impressed by the wide range of intelligence that our examination process generates. But as banking organizations become more complex we are going to need all the help we can get, especially if we wish to avoid killing the goose that laid the golden egg through more intrusive supervision. Market discipline has some risks. It cannot be turned off once begun and could present its own problems during periods of generalized stress by creating additional pressures that authorities would prefer to avoid. In short, it can be a mixed blessing. As policymakers, we need to balance the risk it presents with the benefits it can provide in curbing excessive risk-taking and preventing problems altogether.
Conclusion
In closing, we have seen important gains in risk management throughout this decade and substantial innovation in financial markets and products. These changes bode well, I believe, for distributing risks more efficiently and producing further gains in economic growth in the years to come. They may also, though, produce greater market volatility, as more sophisticated techniques for valuing financial assets identify the winners and losers with greater speed. We also learned that some of these techniques, until refined with experience, might mislead their users. All of this presents continued challenges for central banks and financial supervisors. The best approach, I believe, is to move with the industry and conform oversight functions more closely to business practice. Supervisors can do much in this way to promote sound risk management around the globe and to provide banks with stronger incentives to manage and control their risks. It will require functional regulators to work together and with market participants, too. It will also require regulators to rely more on market discipline and to ensure that investors and others have meaningful information about the level and nature of financial risk. By providing leadership in reaching agreements about useful disclosures, we also can help there. Heavier supervision and regulation of banks and other financial firms is not a solution, despite the size of some institutions today and their potential for contributing to systemic risk. Increased oversight can undermine market discipline and contribute to moral hazard. Less reliance on governments and more on market forces is the key to preparing the financial system for the next millennium.
|
board of governors of the federal reserve system
| 1999 | 11 |
Remarks by Mr Roger W Ferguson Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, before the National Economic Association in Boston on 7 January 2000.
|
Mr Ferguson compares Asian and Latin American experiences during the recent financial turmoil
Remarks by Mr Roger W Ferguson Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, before the National Economic Association in Boston on 7 January 2000.
* * *
Now that the recent emerging markets crisis has been resolved - or at least is well on its way toward resolution - it is appropriate to step back and assess what we have learned from this episode. Perhaps more than anything else, the crisis has raised an array of questions that demand our consideration, including the following: What are the major macroeconomic and financial characteristics that make countries vulnerable to crises? What factors determine the depth and severity of such crises, both for a particular country and for the global financial system as a whole? What is the nature of contagion? Why does a crisis in a given country adversely affect other countries that often have little interaction with (or little similarity to) the initial crisis country? These questions are of more than just intellectual interest. They have enormous implications for the formulation of economic policy and directly affect the welfare of millions of people across the globe. The economics profession in recent years has made some progress in formulating answers to these questions, but the limits of our understanding remain substantial. Today, I will focus on a subset of these issues, examining an aspect of the recent emerging markets crisis that remains something of a puzzle. For many years, the Asian developing countries have been held up as one of the global economy’s shining success stories. Fueled by high domestic savings and investment rates, coupled with fiscal restraint and low inflation, real per capita GDP in these economies has risen dramatically since the 1960s. In contrast, during large portions of this period, the Latin American countries have been afflicted by a variety of economic ailments - including fiscal imbalances, capital flight, hyperinflation, and currency crashes - that have depressed domestic savings and investment and hindered economic growth. Judged by several measures, however, the Asian countries were hit much harder by the recent crisis than were the Latin American countries. Why was this the case? What vulnerabilities pushed these dynamic Asian economies into severe crisis? What economic policies or characteristics allowed the major Latin American countries to weather the storm with comparatively less damage? One apparent answer to these questions is that Latin America recently had endured a crisis - the 1994-95 peso devaluation. This observation, however, raises a complementary set of equally intriguing questions. Namely, in what sense does a crisis today inoculate a country against crisis tomorrow? Does this inoculation occur through improved macroeconomic fundamentals, stronger financial sector performance, or some other factors? In an effort to answer such questions, I will focus on two issues. First, I will discuss some evidence assessing the impact of the recent crisis on the emerging Asian and Latin American countries. Second, I will offer some working hypotheses that might explain the differential effects of the crisis on the two regions. Before going further, let me underscore the following point. The Asian economies have proven themselves to be very competitive in global markets and highly resilient.
Consistent with this observation, the major Asian crisis countries appear to be well on their way to recovery, with the possible exception of Indonesia, which remains plagued by political instabilities. But notwithstanding the underlying strength of the Asian economies - or perhaps precisely because of their underlying strength - there is much to be learned in assessing why they fared poorly relative to their Latin American counterparts during the recent round of crises.
Comparing the impact of the crisis
The behavior of GDP provides a natural measure for assessing the severity of the recent crisis in the two regions (exhibit 1). The Asian crisis began in Thailand during July 1997, when a run on the country’s currency - the baht - forced the government to float the exchange rate. Not coincidentally, real GDP peaked in mid-1997 and fell more than 10% before reaching a trough during the second half of 1998. (Thailand’s GDP declined during 1996:Q4 and 1997:Q1 and rebounded briskly during 1997:Q2, but it remained below its 1996 high; I have chosen 1997:Q2, the local maximum, as the relevant pre-crisis peak.) After the devaluation of the baht, the crisis spread to Malaysia, Indonesia, and the Philippines. Each of these countries was eventually forced to allow its currency to float more freely against the dollar, and each country also experienced a sharp contraction of its GDP. In Malaysia, output fell 10% between the third quarter of 1997 and the third quarter of 1998. Indonesia was even more severely afflicted, with GDP plunging more than 15% during that time. The Philippines was relatively less affected, however, as GDP declined just 3% from peak to trough. Korea, which was the last major Asian country hit by the crisis, endured sharp declines in the value of its currency and its domestic financial markets during late 1997 and early 1998. Korean GDP dropped about 8% during this period. Turning now to Latin America, the region first felt the effects of the financial crisis during the fall of 1997. At that time, Brazil - the center of the crisis in Latin America - raised interest rates to high levels and drew down its stock of international reserves to keep its currency - the real - from falling. However, pressures on Brazil re-emerged following the Russian devaluation in August 1998. The high interest rates necessary to defend the real became increasingly difficult to sustain, and the authorities floated the currency in January 1999. The real depreciated about one-third in the subsequent six months. To the surprise of most observers, Brazilian GDP fell only 3% from its peak during the second quarter of 1998 to a trough at the end of that year. Perhaps even more surprisingly, Brazilian inflation remained restrained after the devaluation, and the economy registered strong growth during the first half of 1999, nearly erasing the contraction that occurred during the second half of 1998. Ironically, several of Brazil’s neighbors have fared worse. Argentina successfully defended its exchange rate regime but has suffered a comparatively steep decline in economic activity. The high interest rates necessary to defend its peg to the dollar, along with the intensified uncertainty generated by the crisis and reduced competitiveness following the Brazilian devaluation, pushed Argentine GDP down 5% from mid-1998 to mid-1999. Activity in Chile, Colombia, and Ecuador has fallen by a comparable amount. Mexico’s experience, however, stands as an interesting contrast. For a number of reasons, which I will discuss later, economic activity in Mexico generally has remained strong. The country has endured only one quarter of negative GDP growth in the past three years.
A second means of assessing the impact of the crisis on the two regions is the degree of external adjustment that occurred (exhibit 2 or exhibit 2A). Significantly, Thailand’s current account swung from a deficit of 8% of GDP in 1996 to a surplus of 12% of GDP in 1998. This startling adjustment in the current account balance was accomplished entirely by a compression of imports: the dollar value of exports was about flat, while imports plunged 40%. The story for other major Asian crisis countries is broadly similar. In contrast, current account adjustment in the three major Latin American countries has been much less pronounced. All three countries have remained in deficit, with total adjustment during the crisis estimated to amount to only about $15 billion (compared with $120 billion in Asia) and current account deficits remaining around 3% to 5% of GDP. The extent of current account adjustment in Asia was necessitated in part by a sharp tailing off of private capital flows to the region (exhibit 3). Specifically, gross private financial flows to Korea, Malaysia, Thailand, and the Philippines plunged from $28 billion in the second half of 1996 to just $4 billion in the last half of 1998 (these data include bank loans received and issuance of bonds and equities, and exclude interbank flows). Over this period, new bank loans issued to these economies fell nearly two-thirds, and bond issuance declined more than 90%. Private financial flows to Latin America, in contrast, did not decline severely until the second half of 1998. Flows to both regions rebounded during the first half of 1999. The available evidence clearly indicates that the Asian developing countries experienced sharper GDP declines, more severe current account adjustment, and a steeper falloff in private capital flows during the recent crisis than did their Latin American counterparts. I will now consider four broad hypotheses that may account for the differential impact of the crisis on the two regions. First, were macroeconomic conditions in Asia in worse shape than those in Latin America? Asia’s growth performance in the years immediately preceding the crisis did not hint at disruptive imbalances. These economies were registering average annual growth of roughly 7% to 8%. While such rates of expansion might be considered unsustainably rapid, these countries had maintained such performance for many years without any apparent adverse effects. By comparison, the Latin Americans recorded more moderate growth rates in the years before the crisis. Inflation in both regions generally was well contained. What about current account imbalances (exhibit 4)? All of these countries ran current account deficits in the period before the crisis. Although the imbalances in Asia were somewhat larger than those in Latin America, such differences - with the possible exception of Thailand - do not appear to be particularly significant. Moreover, the current account performance of the three major Latin American countries deteriorated further during 1998. Were real exchange rates in Asia substantially overvalued relative to those in Latin America (exhibit 5)?
During the first half of 1997, immediately before the onset of the crisis, the broad real exchange rates for Thailand, Indonesia, and Malaysia were only about 5% to 8% stronger than their 1990-96 average, and Korea’s real exchange rate was slightly weaker than its 1990-96 average. The real exchange rate for the Philippines was more than 20% above its average, but - as previously noted above - this country was relatively less affected by the crisis. Hence, there is little evidence of substantial overvaluation of exchange rates in these countries. In comparison, during the first half of 1998, before the onset of the Latin American phase of the crisis, the real exchange rates for Argentina and Brazil were about 10% stronger than the 1990-97 average, while Mexico’s real rate was about 5% stronger. Finally, the fiscal position of the Asian countries (exhibit 6) was no worse - and perhaps somewhat better - than that of the Latin American countries. Countries in both regions ran small deficits or slight surpluses. Brazil was the pronounced exception, with a fiscal deficit well over 5% of GDP. Accordingly, there seems to be little evidence that Asia’s macro fundamentals in the run-up to the crisis were significantly weaker than those for the Latin American countries. Indeed, indicators point to roughly comparable performance and risk in the two regions. A second hypothesis is that the financial systems in the Asian developing countries were beset with greater vulnerabilities than was the case in Latin America. This explanation seems to be more promising. Consider the case of Thailand. In the mid-1990s, the country’s domestic savings rate was about 35% of GDP, and the current account deficit was about 8% of GDP. This suggests that the financial system each year was intermediating investment flows on the order of 40% of GDP. It is conceivable that an economy with sound financial infrastructure and well-developed legal and regulatory institutions could have efficiently allocated such massive flows. However, in an emerging market economy that is still developing the institutions required to regulate, supervise, and support a viable financial system, it is not surprising that over time financial resources were misallocated and a substantial stock of bad loans emerged. In the absence of strong prudential regulation, the financial sector may have had inadequate incentives to assess risk properly and to monitor borrowers. From this perspective, the real puzzle may be why the Asian financial system did not falter sooner. Stated in slightly different terms, IMF statistics (exhibit 7) indicate that in 1996 bank claims on the private sector were about 100% of GDP in Thailand and about 60% of GDP in Korea. These numbers far exceed the ratios of roughly 20% of GDP that prevailed in Mexico, Argentina, and Brazil, countries that also experienced considerably slower domestic loan growth in the years preceding the recent crisis. The systemic risks implied by weak regulation may be much more severe in countries with relatively large banking systems. This observation is not unique to developing countries. Even Japan, one of the world’s advanced industrial countries, has struggled with a bad-loan problem, partially because of massive bank lending in an environment of inadequate prudential supervision. 
Moreover, countries where the banking system is large relative to GDP may be forced to endure more severe macroeconomic disruptions when a banking crisis actually occurs, since the costs of cleaning up the crisis are likely to be large compared with the size of the economy and firms are likely to be highly dependent on banks for the provision of credit. In light of these considerations, it is perhaps not surprising that Asian financial institutions were hit harder by the crisis than Latin American institutions. For example, Thailand was forced to close dozens of finance companies, and deep financial problems in two of Korea’s largest banks persuaded the authorities to temporarily nationalize those institutions. In addition, an enormous bad-loan problem in Thailand and Indonesia - and to some extent in the other Asian crisis countries, including Korea - emerged after the crisis, thus creating uncertainty about the future economic performance and financial viability of these economies. In contrast, major Latin American countries moved to strengthen their financial systems following the peso crisis in the mid-1990s. As a result, banks in these countries appear to have sustained minimal new damage. Argentina is a notable example: efforts to strengthen the financial position of the banks, coupled with improvements in prudential regulation, have allowed the banks to remain relatively healthy, despite the effects of the crisis and the country’s protracted recession. This discussion suggests that financial system vulnerabilities probably were more severe in Asia than in Latin America. Such a conclusion, however, does not necessarily imply that Asian financial institutions were fundamentally less efficient or effective, only that they were more vulnerable when the crisis hit. (Consider the following: suppose that the Asian banking systems and the Latin American banking systems were asked to intermediate a similar quantity of funds. In which region would the resulting allocation of credit and level of financial sector performance be closest to the “optimal” outcome? The answer to this question is not immediately clear.) Suffice it to say that policymakers in Latin America also are wrestling with sizable financial sector problems. A third hypothesis is that the crisis hit Asia harder because investor sentiment shifted more abruptly than it did in Latin America. In other words, one might ask: did the crisis somehow reveal comparatively more information about the performance of the Asian developing countries and their economic prospects? The answer to this question seems to be “yes.” The major revelation - or wake-up call - that transformed the Asian devaluations into first-order crises was the (apparently sudden) realization on the part of investors that the Asian countries were suffering from deep structural imbalances, particularly in their financial sectors. This dramatic shift in sentiment was a central feature of the crisis in Asia. Importantly, there were no similar downside surprises in Latin America, because investors were already familiar with the region’s structural inefficiencies and difficulties. It is fair to say that memories of the peso devaluation in 1994-95 remained fresh in investors’ minds. The sudden shift in investor perceptions of the Asian countries is reflected in the dramatic downgrading of sovereign debt ratings that occurred between mid-1997 and early 1998 (exhibit 8). During this period, Standard & Poor’s downgraded Thailand four notches (from A to BBB-), Indonesia, six notches (from BBB to B), and Korea, ten notches (from AA- to B+).
In contrast, S&P’s credit ratings for Argentina and Mexico were unchanged during the crisis (at BB), and Brazil’s rating was reduced only one notch (from BB- to B+). (The sovereign ratings assigned by Moody’s moved in a broadly similar fashion.) The less-pronounced reversal in sentiment regarding Latin America may be partially due to the fact that the crisis hit Asia first. This gave policymakers in Latin America an opportunity to implement pre-emptive measures, such as interest rate increases and initiatives to improve fiscal performance, that may have helped minimize subsequent damage. Another implication of the difference in timing was that “hot money” was able to leave Latin America gradually rather than in one frenetic rush. The Latin American countries also may have benefited because global economic policymakers - including the IMF - had learned from their experiences in Asia and Russia. A fourth hypothesis focuses on the speed and effectiveness of the policy response. Were policymakers in Asia slower to respond to the crisis than those in Latin America, and did this have a bearing on the severity of the crisis? The response to the crisis in the two regions differed in at least one important respect: the Latin Americans were much quicker to raise interest rates aggressively. Notably, short-term interest rates in Thailand did not peak until the second half of 1997, well after the baht was allowed to float. The authorities were hesitant to raise interest rates rapidly during the first half of that year, notwithstanding the fact that the currency had come under severe pressure. As an alternative strategy, the Thai authorities attempted to defend the pegged exchange rate regime through substantial foreign exchange intervention, which drained a significant fraction of official reserves and still failed to achieve the desired goal. In contrast, the major Latin American countries raised interest rates significantly as soon as they were hit by the shock waves from the crisis. More generally, Thailand’s experience highlights the fact that a half-hearted defense of a fixed exchange rate regime may drain reserves, reduce the credibility of the authorities, and ultimately prove unsuccessful. Accordingly, an emerging consensus in the policymaking community suggests that, when fixed exchange rate regimes come under attack, they either should be abandoned as quickly as possible or defended with all available instruments, including aggressive monetary tightening. This discussion suggests a further observation. Ironically, Latin America’s history of financial crises actually may have helped limit the damage done by the recent crisis. The region’s previous crises have allowed policymakers (many of whom are well-trained technocrats) to gain significant experience in crisis management. The Asian economies, in contrast, had grown essentially without interruption for many years. Perhaps as a result, policymakers in the region may have been less prepared to deal with the crisis.
Additionally, since the Latin American countries had much experience with economic crises, the turmoil associated with the recent crisis may have been less damaging to the sentiment of investors and the general public than was the case in Asia. It seems reasonable to hypothesize that, when economic crises are a fact of life, institutions and practices are likely to develop that make countries more resilient in the face of such crises.
Lessons from Mexico’s recent experience
I now turn to Mexico. As mentioned earlier, the Mexican economy has been particularly buoyant during the recent crisis. Are there any general lessons that can be drawn from Mexico’s experience? Although the Mexican authorities have raised interest rates at various times over the past two years to stabilize the peso, they nonetheless have allowed the peso to depreciate. It is probably fair to say that any adverse effects of peso depreciation are small compared with the costs that would have been borne had the authorities attempted to prevent the currency from weakening. This suggests that the policies required to defend a pegged exchange rate - not to mention the disruptions that are endured when such pegs collapse - may leave countries with this sort of regime more vulnerable to downturns when crises occur. An additional factor that likely has contributed to Mexico’s resilience in recent years is its close relationship with the exuberant US economy. Roughly three-quarters of Mexican exports are purchased by the United States, representing about one-fifth of Mexican output. The strength of the US economy also has benefited Latin American countries more generally. On the other hand, the Asian countries are highly integrated with Japan. It may be more than coincidence that the Asian crisis erupted after a sustained depreciation of the yen and as Japan fell into a deep recession. Moreover, during the darkest days of the Asian crisis, Japanese imports from Asia declined sharply, and Japanese bank lending to these countries slowed. Developments in Japan almost certainly exacerbated the breadth and depth of the Asian crisis.
Some conclusions
What conclusions can be drawn from this discussion? First, and not surprisingly, developing countries that are hit with adverse shocks are likely to fare much better if they are highly integrated with a large, booming economy. This unfortunately is more a matter of luck (namely, the position of the partner country in its business cycle) than a clear prescription for economic policy. Second, during times of crisis, flexible exchange rate regimes appear to have major advantages relative to more rigid regimes. Flexible regimes may provide additional room for maneuver, since adjustment to an adverse shock can come through exchange rate depreciation. Flexible exchange rate regimes are also less costly to defend. Third, when adverse shocks arise, the authorities should move quickly to put appropriate policies in place to pre-empt potential difficulties. In particular, in the context of a fixed exchange rate regime, monetary policy should be tightened aggressively to defend the peg, or the regime should be abandoned. A fourth lesson from the discussion is that strong macroeconomic fundamentals are necessary - but not sufficient - for avoiding crisis. Structural policies are also of first-order importance.
Fifth, developing countries should move to strengthen their financial infrastructure, improve the efficiency of financial intermediation, and ensure that the regulations and incentives for proper credit assessment and monitoring are in place. This is particularly important for those countries with huge quantities of domestic savings; if appropriate measures are not implemented, such countries may face further financial crises in coming years. Finally, probably the major factor generating the sharp economic decline in Asia was the dramatic reassessment of prospects for the region, largely reflecting a “wake-up call” regarding the financial sector’s level of performance. Of necessity, investors will punish such downside surprises severely. To limit the magnitude of such downside surprises, policymakers should take action to maximize financial sector transparency and ensure appropriate disclosure of information.
|
board of governors of the federal reserve system
| 2000 | 1 |
Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the Economic Club of New York, New York, on 13 January 2000.
|
Mr Greenspan discusses technology and the US economy
Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the Economic Club of New York, New York, on 13 January 2000.
* * *
We are within weeks of establishing a record for the longest economic expansion in this nation’s history. The 106-month expansion of the 1960s, which was elongated by the Vietnam War, will be surpassed in February. Nonetheless, there remain few evident signs of geriatric strain that typically presage an imminent economic downturn. Four or five years into this expansion, in the middle of the 1990s, it was unclear whether, going forward, this cycle would differ significantly from the many others that have characterized post-World War II America. More recently, however, it has become increasingly difficult to deny that something profoundly different from the typical postwar business cycle has emerged. Not only is the expansion reaching record length, but it is doing so with far stronger-than-expected economic growth. Most remarkably, inflation has remained subdued in the face of labor markets tighter than any we have experienced in a generation. Analysts are struggling to create a credible conceptual framework to fit a pattern of interrelationships that has defied conventional wisdom based on our economy’s history of the past half century. When we look back at the 1990s, from the perspective of say 2010, the nature of the forces currently in train will have presumably become clearer. We may conceivably conclude from that vantage point that, at the turn of the millennium, the American economy was experiencing a once-in-a-century acceleration of innovation, which propelled forward productivity, output, corporate profits, and stock prices at a pace not seen in generations, if ever. Alternatively, that 2010 retrospective might well conclude that a good deal of what we are currently experiencing was just one of the many euphoric speculative bubbles that have dotted human history. And, of course, we cannot rule out that we may look back and conclude that elements from both scenarios have been in play in recent years. On the one hand, the evidence of dramatic innovations - veritable shifts in the tectonic plates of technology - has moved far beyond mere conjecture. On the other, these extraordinary achievements continue to be bedeviled by concerns that the so-called New Economy is spurring imbalances that at some point will abruptly adjust, bringing the economic expansion, its euphoria, and wealth creation to a debilitating halt. This evening I should like to address some of the evidence and issues that pertain to these seemingly alternative scenarios. What should be indisputable is that a number of new technologies that evolved largely from the cumulative innovations of the past half century have now begun to bring about awesome changes in the way goods and services are produced and, especially, in the way they are distributed to final users. Those innovations, particularly the Internet’s rapid emergence from infancy, have spawned a ubiquity of startup firms, many of which claim to offer the chance to revolutionize and dominate large shares of the nation’s production and distribution system. Capital markets, not comfortable dealing with discontinuous shifts in economic structure, are groping for sensible evaluations of these firms.
The exceptional stock price volatility of most of the newer firms and, in the view of some, their outsized valuations, are indicative of the difficulties of divining from the many, the particular few of the newer technologies and operational models that will prevail in the decades ahead. How did we arrive at such a fascinating and, to some, unsettling point in history? The process of innovation, of course, is never-ending. Yet the development of the transistor after World War II appears in retrospect to have initiated an especial wave of innovative synergies. It brought us the microprocessor, the computer, satellites, and the joining of laser and fiber-optic technologies. These, in turn, fostered by the 1990s an enormous new capacity to disseminate information. To be sure, innovation is not confined to information technologies. Impressive technical advances can be found in many corners of the economy. But it is information technology that defines this special period. The reason is that information innovation lies at the root of productivity and economic growth. Its major contribution is to reduce the number of worker hours required to produce the nation’s output. Yet, in the vibrant economic conditions that have accompanied this period of technical innovation, many more job opportunities have been created than have been lost. Indeed, our unemployment rate has fallen notably as technology has blossomed. One result of the more-rapid pace of IT innovation has been a visible acceleration of the process of “creative destruction,” a shifting of capital from failing technologies into those technologies at the cutting edge. The process of capital reallocation across the economy has been assisted by a significant unbundling of risks in capital markets made possible by the development of innovative financial products, many of which themselves owe their viability to advances in IT. Before this revolution in information availability, most twentieth-century business decisionmaking had been hampered by wide uncertainty. Owing to the paucity of timely knowledge of customers’ needs and of the location of inventories and materials flowing throughout complex production systems, businesses, as many of you well remember, required substantial programmed redundancies to function effectively. Doubling up on materials and people was essential as backup to the inevitable misjudgments of the real-time state of play in a company. Decisions were made from information that was hours, days, or even weeks old. Accordingly, production planning required costly inventory safety stocks and backup teams of people to respond to the unanticipated and the misjudged. Large remnants of information void, of course, still persist, and forecasts of future events on which all business decisions ultimately depend are still unavoidably uncertain. But the remarkable surge in the availability of more timely information in recent years has enabled business management to remove large swaths of inventory safety stocks and worker redundancies. Information access in real time - resulting, for example, from such processes as electronic data interface between the retail checkout counter and the factory floor or the satellite location of trucks - has fostered marked reductions in delivery lead times and the related workhours required for the production and delivery of all sorts of goods, from books to capital equipment.
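The link Greenspan draws between lead times and safety stocks has a standard quantitative counterpart. The sketch below applies the textbook square-root safety-stock rule; the service level, demand volatility, and lead times are hypothetical numbers chosen only to show the mechanics, not figures from the speech.

```python
# Illustrative safety-stock arithmetic: under a standard rule, the required
# buffer scales with the square root of the delivery lead time, so shorter,
# better-informed lead times translate directly into leaner inventories.
import math

def safety_stock(daily_demand_sd, lead_time_days, z=1.65):
    """Buffer needed for roughly 95% service (z = 1.65) with normal demand."""
    return z * daily_demand_sd * math.sqrt(lead_time_days)

sd = 40.0                 # hypothetical standard deviation of daily demand, units
for lt in (36, 9):        # hypothetical lead time cut from 36 days to 9
    print(f"lead time {lt:2d} days -> safety stock {safety_stock(sd, lt):,.0f} units")
# Quartering the lead time halves the required safety stock.
```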
The dramatic decline in the lead times for the delivery of capital equipment has made a particularly significant contribution to the favorable economic environment of the past decade. When lead times for equipment are long, the equipment must have multiple capabilities to deal with the plausible range of business needs likely to occur after these capital goods are delivered and installed. With lead times foreshortened, many of the redundancies built into capital equipment to ensure that it could meet all plausible alternatives of a defined distant future could be sharply reduced. That means fewer goods and worker hours are caught up in activities that, while perceived as necessary insurance to sustain valued output, in the end produce nothing of value. Those intermediate production and distribution activities, so essential when information and quality control were poor, are being reduced in scale and, in some cases, eliminated. These trends may well gather speed and force as the Internet alters relationships of businesses to their suppliers and their customers. The process of innovation goes beyond the factory floor or distribution channels. Design times and costs have fallen dramatically as computer modeling has eliminated the need, for example, of the large staff of architectural specification-drafters previously required for building projects. Medical diagnoses are more thorough, accurate, and far faster, with access to heretofore unavailable information. Treatment is accordingly hastened, and hours of procedures eliminated. Indeed, these developments emphasize the essence of information technology - the expansion of knowledge and its obverse, the reduction in uncertainty. As a consequence, risk premiums that were associated with all forms of business activities have declined. Because the future is never entirely predictable, risk in any business action committed to the future - that is, virtually all business actions - can be reduced but never eliminated. Information technologies, by improving our real-time understanding of production processes and of the vagaries of consumer demand, are reducing the degree of uncertainty and, hence, risk. In short, information technology raises output per hour in the total economy principally by reducing hours worked on activities needed to guard productive processes against the unknown and the unanticipated. Narrowing the uncertainties reduces the number of hours required to maintain any given level of production readiness. In economic terms, we are reducing risk premiums and variances throughout the economic decision tree that drives the production of our goods and services. This has meant that employment of scarce resources to deal with heightened risk premiums has been reduced. The relationship between businesses and consumers already is being changed by the expanding opportunities for e-commerce. The forces unleashed by the Internet are almost surely to be even more potent within and among businesses, where uncertainties are being reduced by improving the quantity, the reliability, and the timeliness of information. This is the case in many recent initiatives, especially among our more seasoned companies, to consolidate and rationalize their supply chains using the Internet. Not all technologies, information or otherwise, however, increase productivity - that is, output per hour - by reducing the inputs necessary to produce existing products. Some new technologies bring about new goods and services with above average value added per workhour.
The dramatic advances in biotechnology, for example, are significantly increasing a broad range of productivity-expanding efforts in areas from agriculture to medicine. Indeed, in our dynamic labor markets, the resources made redundant by better information, as I indicated earlier, are being drawn to the newer activities and newer products, many never before contemplated or available. The personal computer, with ever-widening applications in homes and businesses, is one. So are the fax and the cell phone. The newer biotech innovations are most especially of this type, particularly the remarkable breadth of medical and pharmacological product development. At the end of the day, however, the newer technologies obviously can increase outputs or reduce inputs and, hence, increase productivity only if they are embodied in capital investment. Capital investment here is defined in the broadest sense as any outlay that enhances future productive capabilities and, consequently, capital asset values. But for capital investments to be made, the prospective rate of return on their implementation must exceed the cost of capital. Gains in productivity and capacity per real dollar invested clearly rose materially in the 1990s, while the increase in equity values, reflecting that higher earnings potential, reduced the cost of capital. In particular, technological synergies appear to be engendering an ever-widening array of prospective new capital investments that offer profitable cost displacement. In a consolidated sense, reduced cost generally means reduced labor cost or, in productivity terms, fewer hours worked per unit of output. These increased real rates of return on investment and consequent improved productivity are clearly most evident among the relatively small segment of our economy that produces high-tech equipment. But the newer technologies are spreading to firms not conventionally thought of as high tech. (Since the early 1990s, the annual growth rate in output per hour of non-financial corporate businesses outside high tech has risen by a full percentage point.) It would be an exaggeration to imply that whenever a cost increase emerges on the horizon, there is a capital investment that is available to quell it. Yet the veritable explosion of high-tech equipment and software spending that has raised the growth of the capital stock dramatically over the past five years could hardly have occurred without a large increase in the pool of profitable projects becoming available to business planners. As rising productivity growth in the high-tech sector since 1995 has resulted in an acceleration of price declines for equipment embodying the newer technologies, investment in this equipment by firms in a wide variety of industries has expanded sharply. Had high prospective returns on these capital projects not materialized, the current capital equipment investment boom - there is no better word - would have petered out long ago. In the event, overall equipment and capitalized software outlays as a percentage of GDP in nominal dollars have reached their highest level in post-World War II history. To be sure, there is also a virtuous capital investment cycle at play here. A whole new set of profitable investments raises productivity, which for a time raises profits - spurring further investment and consumption. At the same time, faster productivity growth keeps a lid on unit costs and prices.
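The investment criterion invoked above - undertake the project only if its prospective return exceeds the cost of capital - can be made concrete with a small present-value calculation. A minimal sketch, with a hypothetical labor-displacing project and a hypothetical 12% cost of capital:

```python
# Illustrative hurdle-rate test: a project goes forward only if its internal
# rate of return exceeds the firm's cost of capital. All figures are
# hypothetical, chosen only to show the mechanics.

def npv(rate, cashflows):
    """Net present value of a cashflow stream; cashflows[0] is time 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-8):
    """Internal rate of return by bisection (assumes one sign change)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# A labor-displacing IT outlay: $100 up front, $30 a year in cost savings for 5 years.
project = [-100.0] + [30.0] * 5
cost_of_capital = 0.12            # hypothetical required return

rate_of_return = irr(project)
print(f"prospective return: {rate_of_return:.1%}")   # ~15.2%
print("invest" if rate_of_return > cost_of_capital else "pass")
```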
Firms hesitate to raise prices for fear that their competitors will be able, with lower costs from new investments, to wrest market share from them. Indeed, the increasing availability of labor-displacing equipment and software, at declining prices and improving delivery lead times, is arguably at the root of the loss of business pricing power in recent years. To be sure, other inflation-suppressing forces have been at work as well. Marked increases in available global capacity were engendered as a number of countries that were previously members of the autarchic Soviet bloc opened to the West, and as many emerging-market economies blossomed. Reductions in Cold War spending in the United States and around the world also released resources to more productive private purposes. In addition, deregulation that removed bottlenecks and hence increased supply response in many economies, especially ours, has been a formidable force suppressing price increases as well. Finally, the global economic crisis of 1997 and 1998 reduced the prices of energy and other key inputs into production and consumption, helping to hold down inflation for several years. Of course, Europe and Japan have participated in this recent wave of invention and innovation and have full access to the newer technologies. However, they arguably have been slower to apply them. The relatively inflexible and, hence, more costly labor markets of these economies appear to be an important factor. The high rates of return offered by the newer technologies are largely the result of labor cost displacement, and because it is more costly to dismiss workers in Europe and Japan, the rate of return on the same equipment is correspondingly less there than in the United States. Here, labor displacement is more readily countenanced both by law and by culture, facilitating the adoption of technology that raises standards of living over time. There, of course, has been a substantial amount of labor-displacing investment in Europe to obviate expensive increased employment as their economies grow. But it is not clear to what extent such investment has been directed at reducing existing levels of employment. It should always be remembered that in economies where dismissing a worker is expensive, hiring one will also be perceived to be expensive. An ability to reorganize production and distribution processes is essential to take advantage of newer technologies. Indeed, the combination of a marked surge in mergers and acquisitions, and especially the vast increase in strategic alliances, including across borders, is dramatically altering business structures to conform to the imperatives of the newer technologies. (For example, the emergence of many alternate technologies in areas where only one or two will set the standard and survive has created high-risk, high-reward outcomes for their creators. The desire to spread risk, and the willingness to forgo the winner-take-all return, has fostered a substantial number of technology-sharing alliances.) We are seeing the gradual breaking down of competition-inhibiting institutions from the keiretsu and chaebol of East Asia, to the dirigisme of some of continental Europe. The increasingly evident advantages of applying the newer technologies are undermining much of the old political wisdom of protected stability. The clash between unfettered competitive technological advance and protectionism, both domestic and international, will doubtless engage our attention for many years into this new century. The turmoil in Seattle last month may be a harbinger of an intensified debate.
However one views the causes of our low inflation and strong growth, there can be little argument that the American economy as it stands at the beginning of a new century has never exhibited so remarkable a prosperity for at least the majority of Americans. Nonetheless, this seemingly beneficial state of affairs is not without its own set of potential challenges. Productivity-driven supply growth has, by raising long-term profit expectations, engendered a huge gain in equity prices. Through the so-called “wealth effect,” these gains have tended to foster increases in aggregate demand beyond the increases in supply. It is this imbalance between growth of supply and growth of demand that contains the potential seeds of rising inflationary and financial pressures that could undermine the current expansion. Higher productivity growth must show up as increases in real incomes of employees, as profit, or more generally as both. Unless the propensity to spend out of real income falls, private consumption and investment growth will rise, as indeed it must, since over time demand and supply must balance. (I leave the effect of fiscal policy for later.) If this were all that happened, accelerating productivity would be wholly benign and beneficial. But in recent years, largely as a result of the appreciating values of ownership claims on the capital stock, themselves a consequence, at least in part, of accelerating productivity, the net worth of households has expanded dramatically, relative to income. This has spurred private consumption to rise even faster than the incomes engendered by the productivity-driven rise in output growth. Moreover, the fall in the cost of equity capital corresponding to higher share prices, coupled with enhanced potential rates of return, has spurred private capital investment. There is a wide range of estimates of how much added growth the rise in equity prices has engendered, but they center around 1 percentage point of the somewhat more than 4 percentage point annual growth rate of GDP since late 1996. Such overall extra domestic demand can be met only with increased imports (net of exports) or with new domestic output produced by employing additional workers. The latter can come only from drawing down the pool of those seeking work or from increasing net immigration. Thus, the impetus to spending from the wealth effect by its very nature clearly cannot persist indefinitely. In part, it adds to the demand for goods and services before the corresponding increase in output fully materializes. It is, in effect, increased purchasing from future income, financed currently by greater borrowing or reduced accumulation of assets. If capital gains had no evident effect on consumption or investment, their existence would have no influence on output or employment either. Increased equity claims would merely match the increased market value of productive assets, affecting only balance sheets, not flows of goods and services, not supply or demand, and not labor markets. But this is patently not the case. Increasing perceptions of wealth have clearly added to consumption and driven down the amount of saving out of current income and spurred capital investment. To meet this extra demand, our economy has drawn on all sources of added supply. Our net imports and current account deficits have risen appreciably in recent years.
This has been financed by foreign acquisition of dollar assets fostered by the same sharp increases in real rates of return on American capital that set off the wealth effect and domestic capital goods boom in the first place. Were it otherwise, the dollar’s foreign exchange value would have been under marked downward pressure in recent years. We have also relied on net immigration to augment domestic output. And finally, we have drawn down the pool of available workers. The bottom line, however, is that, while immigration and imports can significantly cushion the consequences of the wealth effect and its draining of the pool of unemployed workers for a while, there are limits. Immigration is constrained by law and its enforcement; imports, by the willingness of global investors to accumulate dollar assets; and the draw down of the pool of workers by the potential emergence of inflationary imbalances in labor markets. Admittedly, we are groping to infer where those limits may be. But that there are limits cannot be open to question. However one views the operational relevance of a Phillips curve or the associated NAIRU (the nonaccelerating inflation rate of unemployment) - and I am personally decidedly doubtful about it - there has to be a limit to how far the pool of available labor can be drawn down without pressing wage levels beyond productivity. The existence or nonexistence of an empirically identifiable NAIRU has no bearing on the existence of the venerable law of supply and demand. To be sure, increases in wages in excess of productivity growth may not be inflationary, and destructive of economic growth, if offset by decreases in other costs or declining profit margins. A protracted decline in margins, however, is a recipe for recession. Thus, if our objective of maximum sustainable economic growth is to be achieved, the pool of available workers cannot shrink indefinitely. As my late friend and eminent economist Herb Stein often suggested: if a trend cannot continue, it will stop. What will stop the wealth-induced excess of demand over productivity-expanded supply is largely developments in financial markets. That process is already well advanced. For the equity wealth effect to be contained, either expected future earnings must decline, or the discount factor applied to those earnings must rise. There is little evidence of the former. Indeed, security analysts, reflecting detailed information on and from the companies they cover, have continued to revise upward long-term earnings projections. However, real rates of interest on long-term BBB corporate debt, a good proxy for the average of all corporate debt, have already risen well over a full percentage point since late 1997, suggesting increased pressure on discount factors. (The inflation expectations employed in this calculation are those implicit in the gap between the interest rates on ten-year Treasury inflation-indexed notes and those on a nominal security derived from Treasury STRIPS constructed to have comparable duration; the latter are used because they have the same relatively limited liquidity as inflation-indexed notes.) This should not be a surprise because an excess of demand over supply ultimately comes down to planned investment exceeding saving that would be available at the economy’s full potential. In the end, balance is achieved through higher borrowing rates. Thus, the rise in real rates should be viewed as a quite natural consequence of the pressures of heavier demands for investment capital, driven by higher perceived returns associated with technological breakthroughs and supported by a central bank intent on defusing the imbalances that would undermine the expansion. We cannot predict with any assurance how long a growing wealth effect - more formally, a rise in the ratio of household net worth to income - will persist, nor do we suspect can anyone else.
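The real-rate calculation described in the parenthetical note above is easy to reproduce in stylized form: subtract the indexed-note yield from a duration-matched nominal yield to get implied expected inflation, then subtract that "breakeven" rate from the nominal corporate yield. The yields below are hypothetical placeholders, not the 1997-2000 market data Greenspan is describing.

```python
# A rough sketch of the breakeven-inflation method: expected inflation is
# proxied by the gap between a nominal Treasury yield and a comparable
# inflation-indexed yield, and that gap is subtracted from the nominal
# corporate yield to obtain a real rate. All yields are hypothetical.

nominal_treasury_yield = 0.066   # 10-year nominal (STRIPS-derived), hypothetical
tips_yield = 0.042               # 10-year inflation-indexed yield, hypothetical
bbb_nominal_yield = 0.082        # long-term BBB corporate yield, hypothetical

breakeven_inflation = nominal_treasury_yield - tips_yield
real_bbb_rate = bbb_nominal_yield - breakeven_inflation

print(f"implied expected inflation: {breakeven_inflation:.2%}")  # 2.40%
print(f"real BBB corporate rate:    {real_bbb_rate:.2%}")        # 5.80%
```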
A diminution of the wealth effect, I should add, does not mean that prices of assets cannot keep rising, only that they rise no more than income. A critical factor in how the rising wealth effect and its ultimate limitation will play out in the market place and the economy is the state of government, especially federal, finances. The sharp rise in revenues (at a nearly 8% annual rate since 1995) has been significantly driven by increased receipts owing to realized capital gains and increases in compensation directly and indirectly related to the huge rise in stock prices. Both the Administration and the Congress have chosen wisely to allow unified budget surpluses to build and have usefully focused on eliminating the historically chronic borrowing from social security trust funds to finance current outlays. The growing unified budget surpluses have absorbed a good part of the excess of potential private demand over potential supply. A continued expansion of the surplus would surely aid in sustaining the productive investment that has been key to leveraging the opportunities provided by new technology, while holding down a further reliance on imports and absorption of the pool of available workers. I trust that the recent flurry of increased federal government outlays, seemingly made easier by the emerging surpluses, is an aberration. In today’s environment of rapid innovation, growing unified budget surpluses can obviate at least part of the rebalancing pressures evident in marked increases in real long-term interest rates. As I noted at the beginning of my remarks, it may be many years before we fully understand the nature of the rapid changes currently confronting our economy. We are unlikely to fully comprehend the process and its interactions with asset prices until we have been through a complete business cycle. Regrettably, we at the Federal Reserve do not have the luxury of awaiting a better set of insights into this process. Indeed, our goal, in responding to the complexity of current economic forces, is to extend the expansion by containing its imbalances and avoiding the very recession that would complete a business cycle. If we knew for sure that economic growth would soon be driven wholly by gains in productivity and growth of the working age population, including immigration, we would not need to be as concerned about the potential for inflationary distortions. Clearly, we cannot know for sure, because we are dealing with world economic forces which are new and untested. While we endeavor to find the proper configuration of monetary and fiscal policies to sustain the remarkable performance of our economy, there should be no ambiguity on the policies required to support enterprise and competition. I believe that we as a people are very fortunate: when confronted with the choice between rapid growth with its inevitable insecurities and a stable, but stagnant economy, given time, Americans have chosen growth.
But as we seek to manage what is now this increasingly palpable historic change in the way businesses and workers create value, our nation needs to address the associated dislocations that emerge, especially among workers who see the security of their jobs and their lives threatened. Societies cannot thrive when significant segments perceive their functioning as unjust. It is the degree of unbridled fierce competition within and among our economies today - not free trade or globalization as such - that is the source of the unease that has manifested itself, and was on display in Seattle a month ago. Trade and globalization are merely the vehicles that foster competition, whose application and benefits currently are nowhere more evident than here, today, in the United States. Confronted face-on, no one likes competition; certainly, I did not when I was a private consultant vying with other consulting firms. But the competitive challenge galvanized me and my colleagues to improve our performance so that at the end of the day we and, indeed, our competitors, and especially our clients, were more productive. There are many ways to address the all too real human problems that are the inevitable consequences of accelerating change. Restraining competition, domestic or international, to suppress competitive turmoil is not one of them. That would be profoundly counterproductive to rising standards of living. We are in a period of dramatic gains in innovation and technical change that challenge all of us, as owners of capital, as suppliers of labor, as voters and policymakers. How well policy can be fashioned to allow the private sector to maximize the benefits of innovations that we currently enjoy, and to contain the imbalances they create, will shape the economic configuration of the first part of the new century.
Mr Gramlich focuses on inflation targeting

Remarks by Mr Edward M Gramlich, Member of the Board of Governors of the US Federal Reserve System, before the Charlotte Economics Club, Charlotte, NC on 13 January 2000.

* * *

At one time or another, opinions on how central banks should operate have focused on fixing exchange rates, stabilizing the rate of growth of the money supply, smoothing the growth of nominal income, or setting short-term interest rates through an instrument rule. But lately both academic economists and central bankers have become enamored of a new procedure that may have more staying power than these other approaches and that has certainly been more widely adopted around the world. Called inflation targeting, this new procedure essentially commits a country’s central bank to hit an inflation target, usually expressed as a low, positive rate of inflation subject to some margin of error and some allowance for outside price shocks. New Zealand was the first country to adopt a formal inflation targeting regime back in 1990. It was soon followed by Canada, the United Kingdom, Sweden, and Australia. The European Central Bank at least alludes to inflation targeting in its strategy statements, as do many countries in Eastern Europe that aspire to join the European Union. At least ten emerging-market countries have adopted inflation targeting as a way to correct persistent inflation problems. In the last few months alone, Turkey, Switzerland, and South Africa have announced that they are switching to inflation targeting. When tallied up, the number of countries on either a formal or an informal inflation targeting regime now approaches thirty. Perhaps more meaningfully, there seems to be no country that has first tried inflation targeting and then abandoned it. One apparent holdout is the United States. While US central bankers have often stressed the paramount importance of controlling inflation, the United States has never adopted a formal inflation target or an inflation targeting regime. The Federal Reserve operates under the Federal Reserve Act, which requires the Fed to try to achieve maximum employment along with price stability. But both in the academic community and in the halls of the Congress, there are advocates for change. In a series of articles and books, Bernanke, Laubach, Mishkin, and Posen (1999) have proposed that the United States adopt an explicit inflation targeting regime. Senator Connie Mack has introduced a bill to this effect in the Congress, so far not adopted. Recognizing that the question of whether the United States should adopt inflation targeting is ultimately a congressional prerogative, one could still ask the normative question of whether the United States should go to inflation targeting. The question is difficult to answer. While inflation targeting seems to have been successful around the world, the preconditions for success may not be relevant for this country. Given the strong US commitment to controlling inflation, in the end there may be little difference between the way monetary policy already is practiced in the United States and the way it is practiced under the flexible, forward-looking inflation targeting regimes followed by many countries.
Finally, although one can find economic circumstances in which inflation targeting will work well, it is also possible to imagine circumstances in which even forward-looking, flexible inflation targeting may not work so well, some of these circumstances drawn from the fairly recent past of the United States.

Basic aspects of inflation targeting

Describing an inflation targeting regime is straightforward. A country or its central bank commits to controlling inflation, with an explicit target rate and usually a tolerance band around this target rate. Obvious price shocks such as indirect taxes, commodity prices, or interest rates themselves are usually excluded in the calculation of inflation targets. Many inflation targeting regimes permit flexibility for pursuing other goals, such as output stabilization, though the primary commitment of the central bank is clearly to control inflation. Given the lags in monetary policy, many regimes also are forward-looking, in the sense that the central bank operates not against current inflation but expected inflation in the near future. Three rationales normally are given for the adoption of inflation targeting regimes: the provision of a nominal anchor for policy, transparency, and credibility. A nominal anchor may be required if countries permit their exchange rates to vary and if they do not target either the growth of monetary quantities or nominal income. Governments or their central banks may need such an anchor to stabilize inflation, and they can generate the anchor by announcing an inflation target and then doing what they must do to hit that target. Such an approach would also lead to more transparent monetary policies, as economic actors would better understand the goals of monetary authorities. To the extent that monetary authorities can hit their target, central banks would also gain credibility, which many need after years of inflation. The ideals of transparency and credibility certainly have democratic value in their own right, but they may also pay off in narrower economic terms. It is commonly argued that inflationary expectations are a key aspect of the inflation process. In lowering inflationary expectations, inflation targeting can itself help reduce inflationary pressures. One can also rationalize inflation targeting through another form of economic reasoning. Some years ago many believed, along with Milton Friedman, that stabilizing the growth of the money supply would lead to stable prices. But this approach is now generally discredited because shocks in the demand for money and an unstable transmission mechanism imply that stable growth of monetary aggregates could lead to quite unstable behavior for prices and real incomes. The next step was to follow Bennett McCallum (1988) and avoid inappropriate responses to shocks in the demand for money by having the central bank simply stabilize nominal income growth. But again, if there were shocks in this nominal income growth, say productivity shocks, stabilizing nominal income growth would not necessarily stabilize prices. The same productivity shocks have led to difficulties with the instrument rule proposed by John Taylor (1993), which requires either a predictable rate of growth of potential output or a predictable natural rate of unemployment.
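The Taylor (1993) rule just mentioned makes this dependence on potential output explicit. The sketch below uses Taylor's original 0.5 coefficients; the example economy's numbers are hypothetical, and the point is only that a mismeasured level of potential output shifts the output gap and hence the prescribed policy rate.

```python
# Taylor (1993) instrument rule: i = r* + pi + 0.5*(pi - pi*) + 0.5*gap.
# A productivity shock that raises true potential output changes the gap
# and therefore the rate the rule prescribes. Example figures are hypothetical.

def taylor_rate(inflation, output_gap, r_star=0.02, pi_star=0.02):
    """Policy rate prescribed by the Taylor rule."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

inflation = 0.02
actual_output, assumed_potential = 100.0, 100.0
true_potential = 102.0   # suppose a productivity shock raised potential output

gap_assumed = (actual_output - assumed_potential) / assumed_potential
gap_true = (actual_output - true_potential) / true_potential

print(f"rate with assumed potential: {taylor_rate(inflation, gap_assumed):.2%}")  # 4.00%
print(f"rate with true potential:    {taylor_rate(inflation, gap_true):.2%}")     # ~3.02%
```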
As these other procedures for conducting monetary policy have run into difficulty, academic economists have increasingly drifted to the straightforward view of Bernanke, Laubach, Mishkin, Posen and many others that, if central banks want to stabilize prices, they should just do that by inflation targeting. But the migration of academic economists to inflation targeting is nothing compared with the migration of actual real world countries to inflation targeting. The earliest and still most elaborate procedures were adopted by New Zealand, where the parliamentary government in 1990 began negotiating inflation targets with its newly independent central bank, making these targets public, and holding the bank responsible for hitting the targets. Other regimes came later and were less elaborate, but by now a great many countries have regimes in which they publish inflation targets, have the central bank commit to meeting these targets, and comment on the progress in meeting the targets. Because all central banks in the world are responsible for controlling inflation, it is reasonable to ask how explicit inflation targeting regimes differ from non-targeting regimes. From a legislative standpoint, the differences seem reasonably clear. Inflation targeting regimes have explicit inflation targets, explicit commitments of the central bank to meet these targets, and less formal commitments to achieve other goals, such as output stabilization. But from a practical standpoint, the differences could be much less distinct. On one side, even non-targeting countries often will be strongly committed to controlling inflation. On the other side, countries that target inflation flexibly and in a forward-looking manner may also strive to reduce output variability, perhaps because it helps to stabilize future inflation. As will be discussed below, an empirical analysis by Cecchetti and Ehrmann (1999) does not find large differences in actual policy parameters between the two sets of countries. All existing inflation targets around the world are for low, positive rates of inflation. For developed countries with stable inflation rates, the world average target rate of inflation is around 2%, with an acceptable band that normally ranges from 1% to 3%. Target levels are higher, but are promised to be stepped down gradually over time, for emerging-market countries that are trying to bring inflation down from very high levels. There is academic interest in targeting future price levels as opposed to inflation rates. The two approaches differ mainly in their response to past errors: is the central bank to be held responsible for offsetting these past errors and getting the price level back on track or just for stabilizing inflation from this time forward? But in practice this distinction may not be that important. King (1999) shows that, if a long enough interval is given to hit the target, there may be little difference between a price level target and an inflation rate target. In any event, no country now targets the future price level. However, there could be an important difference between a target of a low positive rate of inflation and one of a zero rate of inflation. Many potential inflation targeters ask, “Why not zero?” There are three reasons for targeting for an inflation rate above zero. The first is measurement bias. Try as they might, most countries do have some bias in their price indexes. 
It is hard for governmental statistical agencies to eliminate the measurement bias that occurs whenever new and improved goods are introduced to consumers, and new and improved goods are continually being introduced. It is also hard to deal with substitution bias by updating the weights on various consumer goods. Measurement bias is not huge around the world, and it is coming down as statistical agencies adopt new and improved statistical procedures. But there may still be some irreducible upward bias in measuring inflation. The second reason for shooting at a rate of inflation slightly above zero is known as the zero bound problem. If a country’s real interest rates are close to zero and its inflation rate is close to zero, its nominal interest rates will also be close to zero. Since costs of holding cash are minimal, a central bank cannot push nominal interest rates much below zero. This means that countries that target for zero inflation could get in the bind of being unable to ease monetary policy in response to recessionary shocks. Today this issue is not much of a problem around most of the world, but it has become a significant problem in Japan. The balance of economic thought on the issue is that, once a country gets into this zero bound situation, it has a very difficult time getting out. This forms a strong rationale for avoiding the danger in the first place, which can be helped by targeting for a low positive rate of inflation. The third reason for targeting a low positive rate of inflation is labor market inefficiencies. Akerlof, Dickens, and Perry (1996) argue that these can be lessened with some positive inflation. Essentially, employers can be spared the necessity of cutting workers’ nominal wage when these workers’ productivity falls below their real wage. While Akerlof, Dickens, and Perry make an empirical case for their views, others find little evidence that labor markets become less efficient when inflation drops to very low levels. While economists are still debating these issues, from a pragmatic standpoint many countries do seem to be gravitating toward a consensus on how inflation targeting should work. All inflation targeting developed countries target a low positive rate, and emerging-market countries aspire to this kind of target. No country targets for deflation. Most countries target inflation in a flexible and forward-looking manner. Most countries have roughly similar policies toward openness, commitment, and explanation. If whether the target should be a low positive number or zero is all there is to argue about, that disagreement certainly ranks low on the intensity scale of policy disputes.

Theoretical pros and cons

From a theoretical perspective, one might think that inflation targets would be most valuable to countries with a history of bad inflation. These countries’ central banks need credibility and have two basic ways to get it. One is to peg their exchange rate to some hard currency and essentially tie the hands of their central bank. Currency boards, dollar pegs, and dollarization are all examples of such policies. The second route is to adopt an inflation target and to stick to it. If the prior inflation is very bad, as it often is in emerging-market countries, these targets might have to start at a high level and be worked gradually down as the central bank brings inflation under control. Although inflation targeting is usually described as an antidote to past inflationary binges, it has also been suggested as a cure to potential deflation.
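The zero-bound arithmetic behind the second reason above - and behind the deflation-cure argument that follows - can be put in a few lines. With the Fisher relation i = r + expected inflation and a floor of i >= 0, the most stimulative real rate a central bank can engineer is minus expected inflation, so a higher inflation target buys more room to ease. A minimal sketch, with illustrative numbers only:

```python
# Zero-bound arithmetic: given i = r + expected_inflation and i >= 0,
# the lowest achievable real rate is -expected_inflation.

def lowest_real_rate(expected_inflation, nominal_floor=0.0):
    """Most stimulative real rate available at the nominal-rate floor."""
    return nominal_floor - expected_inflation

for pi_e in (0.00, 0.02):
    print(f"inflation target {pi_e:.0%}: real rate can fall to "
          f"{lowest_real_rate(pi_e):+.0%}")
# target 0%: real-rate floor is 0% -- no room to ease in a recession
# target 2%: real-rate floor is -2%
```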
Krugman (1998), for example, has argued that Japan, with nominal interest rates stuck at their floor of zero, can lower real interest rates and stimulate investment by having the Bank of Japan target a positive rate of inflation and do what it can to hit that target. Inflation targeting regimes might also work well when a country undergoes what is known as a productivity shock. Suppose a wave of innovations makes productivity rise, pushing up output and lowering unit labor costs. For a time this shock might be reflected in higher-than-trend rates of growth of output and lower-than-trend rates of unemployment, making it difficult to rely on normal indicators of demand and supply growth in the conduct of monetary policy. In such circumstances, a cautious central bank might well just wait for signs of inflation to emerge, thwart the inflation if it occurs, and not rely as heavily on normal measures of aggregate demand growth or labor market tightness. Such a central bank would, in effect, be following an inflation targeting regime. Although liberals in general have been very critical of inflation targeting (Galbraith, 1999), in this case inflation targeting would lead to exactly the type of monetary policy they would favor. But in some cases, inflation targeting might not work out so well. One case is plain old recessions. Suppose there were a recessionary shock to aggregate demand. Because inflation normally responds slowly to such shocks, inflation targeters could respond in any of three ways. Strict inflation targeters, sometimes snidely called inflation nutters, might sit idly by and let the recession happen. Or, if inflation fell below target ranges, some central banks might take steps to boost inflation by expansionary monetary policy. They would clearly do this if deflation threatened, but they might do it even with low positive rates of inflation below target ranges. The third possible response involves flexible and forward-looking inflation targeting as is actually practiced in most countries. Because inflation usually responds slowly to output changes in recessions, flexible inflation targeting regimes would be free to ease policy to stabilize output, much as would non-targeting central banks. Forward-looking central banks could even act affirmatively against recessions to prevent future inflation from falling below its target range. Svensson (1999) argues that such a policy strategy clearly outperforms other monetary regimes. But even here the flexibility to be forward-looking and to pursue other goals is less than a commitment of the central bank to try to stabilize output or promote full employment. The exact importance of these other objectives remains in question, even for flexible and forward-looking inflation targeting regimes. Other instances in which inflation targeting might not work so well are negative supply shocks, such as most economies experienced in the mid-1970s when oil prices exploded. In these times, inflation rises just as output falls. The most flexible and competent central bank in the world would be faced with a difficult dilemma in such circumstances - forestall the recession by making inflation worse or limit the inflation by making the recession worse. But at least such a central bank would have a choice. In general, an inflation targeting central bank would not have much of a choice. It would be forced to try to limit the inflation by contractionary policies, hence making the recession worse. 
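The supply-shock dilemma just described can be made concrete with a stylized reaction function; this is an illustration, not a description of any actual central bank's rule, and all coefficients are assumptions. Setting the output-gap weight to zero mimics the strict targeter, a positive weight the flexible one.

```python
# Stylized reaction function i = r* + pi + a*(pi - pi*) + b*gap. After an
# oil-price shock pushes inflation up while output falls, both a strict
# targeter (b = 0) and a flexible one (b > 0) are pushed toward tightening;
# the flexible targeter merely tempers it. Coefficients are illustrative.

def policy_rate(pi, gap, pi_star=0.02, r_star=0.02, a=1.5, b=0.5):
    return r_star + pi + a * (pi - pi_star) + b * gap

# Negative supply shock: inflation jumps to 5% while output falls 2% below potential.
strict = policy_rate(0.05, -0.02, b=0.0)
flexible = policy_rate(0.05, -0.02)
print(f"strict targeter raises the rate to:  {strict:.2%}")    # 11.50%
print(f"flexible targeter raises it only to: {flexible:.2%}")  # 10.50%
# Either way, the targeter tightens into a downturn -- the dilemma in the text.
```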
Even a flexible, forward-looking inflation targeting central bank would not have much freedom in such a situation, because in the end the central bank would be evaluated much more on its success in meeting inflation targets than in meeting output growth targets. Hence, inflation targeting does not appear to solve all the problems central banks might face and cannot be prescribed as a panacea. It still might be a reasonable policy strategy for most purposes, and it still seems to be generally the proper approach for dealing with histories of inflation or deflation and, perhaps, with productivity shocks.

Actual experience

Theoretical arguments aside, many countries have used inflation targeting for most of the 1990s. Hence we can do more than theorize: we can actually look at the inflation targeting experience in several countries and see how it has worked out. Various authors have done this in two ways. They have compared a country’s post-inflation targeting history with its pre-inflation targeting history, and they have compared outcomes in inflation targeting countries with those in non-targeting countries. The time series studies have focused mainly on the three countries that have had the longest experience with inflation targeting - New Zealand (adopted in 1990), Canada (1991), and the United Kingdom (1992). According to the simple numbers, once these countries adopted inflation targeting, actual inflation has fallen in each country, and nominal interest rates have fallen, suggesting lower inflation expectations. Real measures, such as the growth in output or unemployment, have either shown little change or worsened only slightly. Generally, unemployment rose as a country disinflated and then returned to its former average level, but sometimes not all the way there. Looking behind these simple numbers, a number of authors have done more sophisticated econometric tests. Ammer and Freeman (1995) estimated VAR models for real GDP, price levels, and real interest rates up to the adoption of a targeting regime and then simulated these models into the targeting era, comparing simulated values with actual values. They found that inflation fell below predictions in all three countries. Real GDP dipped down and then recovered in New Zealand and the United Kingdom but dipped down and only partly recovered in Canada. A subsequent analysis by Freeman and Willis (1995) focused more intensely on interest rates. The authors noted that long-term nominal rates fell in all three countries following the adoption of inflation targeting but then came back up in the mid-1990s. The latter rises could either indicate that inflation targeting regimes became less credible or simply reflect the fact that world interest rates were rising at this time. Freeman and Willis worked out a model to disentangle the two effects and put most of the explanation for rising long-term nominal rates on the behavior of world interest rates, hence suggesting that inflation targeting regimes remained credible. A more recent set of authors conducted similar tests. Mishkin and Posen (1997) noted that all three inflation targeting countries reduced inflation before adopting a formal targeting regime. The achievement of inflation targeting, then, was to lock in the gains of earlier fights to stabilize prices. The authors also estimated VAR equations up to the adoption of inflation targeting and simulated these equations into the targeting period, now for six years.
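The estimate-then-simulate design of these studies can be sketched schematically. The fragment below fits a VAR on pre-targeting data and projects it over the targeting period; the series here are synthetic stand-ins, not the authors' data, so the printed numbers mean nothing beyond illustrating the procedure.

```python
# Schematic version of the Ammer-Freeman / Mishkin-Posen exercise: fit a VAR
# on pre-targeting observations, forecast it into the targeting era, and
# compare the forecast with post-targeting outcomes. Data are synthetic.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
T = 120   # pre-targeting quarterly observations (synthetic)
# Three stationary series: inflation, real GDP growth, real interest rate.
data = rng.normal(size=(T, 3)) * 0.01 + np.array([0.04, 0.025, 0.03])

results = VAR(data).fit(maxlags=4)

steps = 24   # six years of quarters, as in Mishkin-Posen
forecast = results.forecast(data[-results.k_ar:], steps=steps)

# Synthetic "actual" post-targeting outcomes with lower mean inflation.
actual_post = rng.normal(size=(steps, 3)) * 0.01 + np.array([0.02, 0.025, 0.025])
gap = actual_post.mean(axis=0) - forecast.mean(axis=0)
print("actual minus predicted (inflation, growth, real rate):", gap.round(4))
# In the published studies, post-targeting inflation fell below the VAR
# prediction in all three countries, with little change in real growth.
```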
Just as in the earlier analysis of Ammer and Freeman, this analysis suggested that in all three countries the inflation targeting led to a drop in inflation and nominal interest rates. In Canada, the rate of growth of real GDP was down slightly; in the other two countries, there was no change in the rate of growth of real GDP. Similar results were found by Kahn and Parrish (1998). Despite the fact that inflation had dropped in all three countries before the adoption of inflation targeting, these authors observed upward inflationary blips in New Zealand and Canada; so perhaps the achievement of keeping inflation under control should not be taken for granted. Their results were buttressed by those of Kuttner and Posen (1999), whose VAR regressions for the United Kingdom found that inflation persistence was reduced by inflation targeting, as measured by inflation itself and by nominal interest rates. In Canada, there was no inflation persistence before or after inflation targeting as measured by inflation rates, though again targeting reduced inflation persistence as measured by nominal interest rates. In New Zealand, the results were the opposite - targeting reduced persistence as measured by inflation rates but seemed to increase it slightly as measured by nominal interest rates. The main cross-sectional study of countries was done by Cecchetti and Ehrmann (1999). They noted that the decade of the 1990s, when many countries went to inflation targeting, was a good one for economic outcomes: many monetary regimes tried in this decade are likely to look good. In their formal work, these authors fit VAR models for twenty-three countries, nine inflation targeters and fourteen non-targeters. From these models, they deduced policymakers’ aversion to inflation volatility. They found that inflation aversion rose in countries that adopted inflation targeting but only to the level of aversion already apparent in the policies of the non-targeting countries. At this point, there is very little difference between aversion to inflation in countries that target and countries that do not target inflation. Taken together, the basic data, the time series tests, and the cross-section tests indicate that inflation targeting has seemed to succeed. Inflation has dropped materially in the three countries with the longest experience with the regime, and all inflation targeting countries are still content with inflation targeting, in some cases eight to ten years after its adoption. Measures of inflation persistence also have dropped. A seeming weakness of inflation targeting is in its response to unemployment, but at this time one can find little evidence that unemployment has worsened in targeting countries. At the same time, inflation targeting has been adopted in the 1990s, a good decade for economic outcomes in most countries. It is unclear how inflation targeting would look in more difficult economic circumstances such as the 1970s.

Is inflation targeting right for the United States?

The hidden question in all this, of course, is whether the United States should go to a regime of explicit inflation targeting. I am not going to try to answer that question but will make several points. First, the question of whether the United States does or does not adopt a formal inflation targeting regime is not up to the Federal Reserve. The Federal Reserve Act now requires the Fed to strive for maximum employment and balanced growth, along with price stability and moderate long-term interest rates.
Until the Congress changes these guidelines, the Fed will continue to pursue these goals. Second, the Federal Reserve is strongly committed to controlling inflation, however formally this goal is specified in the Fed’s mandate. This can be seen from both words and deeds. Countless official pronouncements and testimony affirm the importance of controlling inflation. As for actions, at least Cecchetti and Ehrmann find that the Fed’s revealed inflation aversion is now as high as that of the formal inflation targeting countries. Given this strong inflation aversion, ultimately there may be little difference between informal inflation targeting as practiced in the United States and flexible, forward-looking inflation targeting as practiced in many other countries around the world. That said, one could still ask the normative question of whether the United States should go to what I will call a more formal system of inflation targeting. Such a system would have pluses and minuses. One potential plus is in credibility and transparency. Even if the present-day pragmatic Fed responds to exogenous rises in the growth of aggregate supply or drops in the non-inflationary rate of unemployment in a fully accommodative manner, inflation targeting may better communicate the strategy. For example, explicit inflation targeting statements may help to make it clear that the Fed is really fighting inflation, not economic growth. But there are also potential disadvantages. Economic circumstances have been good in the 1990s, when countries have gone over to inflation targeting, and it is worth repeating that inflation targeting is no panacea. It may not work well in the presence of negative supply shocks like those experienced throughout the world in the 1970s. Moreover, there is a potential problem with inflation targeting even in good economic times. If forecasting inflation is difficult, even forward-looking inflation targeting central banks may respond to inflationary shocks too late to ward off inflation. Although there are several ways to forecast inflation, none may be that reliable. On one side, many analysts use econometric models, but these may have intrinsic problems in periods of significant structural shifts. The very nature of such shocks is that they are not easy to predict or model. On the other side, one could imagine constructing leading indicators of inflation, but the experience until now is that not many of these are reliable either. One could also rely on market expectations of inflation, survey evidence, or other forecasts of inflation. But if models are not working well and there are not many reliable leading indicators, it is not clear how much information is contained in these other forecasts. Without models or leading indicators, even forward-looking inflation targeting strategies may not work as well as advertised.

Conclusion

Inflation targeting has many things going for it. This strategy of conducting monetary policy has been widely adopted around the world, and it has seemed to be successful in lowering inflation and perceptions of future inflation. It has a potential drawback in ignoring explicit consideration of output gaps and unemployment, but perhaps because it has been applied flexibly and in a forward-looking manner, in fact it has not seemed to generate more unemployment than other monetary regimes would have. It also may not work as well in times of negative supply shocks, though this point remains to be tested.
For the United States, given the strong aversion to inflation already apparent in policy responses, there are various pros and cons, but it is not obvious that a more formal regime of inflation targeting will lead to very great differences in actual monetary policies.

References

Akerlof, George A, William T Dickens, and George L Perry (1996), “The Macroeconomics of Low Inflation,” Brookings Papers on Economic Activity, 1, 1-76.
Ammer, John, and Richard T Freeman (1995), “Inflation Targeting in the 1990s: The Experiences of New Zealand, Canada, and the United Kingdom,” Journal of Economics and Business, 47:165-192.
Bernanke, Ben S, Thomas Laubach, Frederic S Mishkin, and Adam S Posen (1999), Inflation Targeting: Lessons from the International Experience, Princeton University Press.
Cecchetti, Stephen G, and Michael Ehrmann (1999), “Does Inflation Targeting Increase Output Volatility? An International Comparison of Policymakers’ Preferences and Outcomes,” National Bureau of Economic Research Working Paper #7426.
Freeman, Richard T, and Jonathan L Willis (1995), “Targeting Inflation in the 1990s: Recent Challenges,” Federal Reserve Board International Finance Discussion Paper #525.
Galbraith, James K (1999), “The Inflation Obsession: Flying in the Face of Facts,” Foreign Affairs, January-February, 78, 152-156.
Kahn, George A, and Klara Parrish (1998), “Conducting Monetary Policy with Inflation Targets,” Federal Reserve Bank of Kansas City Economic Review, 3rd Quarter, 5-32.
King, Mervyn (1999), “Challenges for Monetary Policy: New and Old,” New Challenges for Monetary Policy, Federal Reserve Bank of Kansas City, forthcoming.
Krugman, Paul R (1998), “It’s Back: Japan’s Slump and the Return of the Liquidity Trap,” Brookings Papers on Economic Activity, 2:1998, 137-206.
Kuttner, Kenneth N, and Adam S Posen (1999), “Does Talk Matter After All? Inflation Targeting and Central Bank Behavior,” New York Federal Reserve Bank, mimeo.
McCallum, Bennett T (1988), “Robustness Properties of a Rule for Monetary Policy,” Carnegie-Rochester Conference Series on Public Policy, 29, 173-204.
Mishkin, Frederic S, and Adam S Posen (1997), “Inflation Targeting: Lessons from Four Countries,” Federal Reserve Bank of New York Economic Policy Review, August, 9-110.
Svensson, Lars E O (1999), “How Should Monetary Policy Be Conducted in an Era of Price Stability?” New Challenges for Monetary Policy, Federal Reserve Bank of Kansas City, forthcoming.
Taylor, John B (1993), “Discretion versus Policy Rules in Practice,” Carnegie-Rochester Conference Series on Public Policy, 39, 195-214.
Mr Meyer gives his views on the sustainability of growth and on monetary policy in the United States

Remarks by Mr Laurence H Meyer, Member of the Board of Governors of the US Federal Reserve System, before the National Economists Club and the Society of Government Economists, Washington, D.C. on 20 January 2000.

* * *

Economic performance over the past several years has been exceptional. The economy is about to set a record for the longest expansion. Economic growth has been proceeding at a rate that is close to double what was generally viewed as the long-term trend when this expansion began. Inflation remains modest, despite a decline in the unemployment rate to levels that many would have expected to trigger a significant acceleration in prices. But if you are a forecaster or a policymaker, rather than an economic historian, you must focus on the next chapter. The fundamental question today, it seems to me, is whether the current set of macroeconomic conditions - specifically, the growth of output and the unemployment rate - is sustainable - that is, consistent with stable, low inflation. If it is, the expansion could continue on its current path, unless disturbed by some shock or policy mistake. Otherwise, the challenge for monetary policy is to guide the economy to a sustainable path while preserving low inflation. This challenge is heightened by the unusual degree of uncertainty about the limits of capacity and potential growth, related in part to ongoing structural changes in the economy. Concerns have also been raised about potential imbalances in some sectors, for example, about the sustainability of equity prices, the personal saving rate, the current account deficit, and debt burdens. These concerns about sustainability and the challenges they pose for monetary policy are the focus of my remarks this afternoon. Let me remind you that, as always, the views I express are my own. I am not speaking on behalf of either the Board of Governors or the Federal Open Market Committee (FOMC).

Varieties of landings

In figure 1, I present four plausible scenarios for future growth. Each has a quite different implication for both the outlook and policy. In each case the economy faces an initial gap between actual output (the dashed line) and potential output (the solid line). Such a gap is typical of conditions that follow a recession. During the expansion phase, at least for a period, growth in production typically exceeds the growth in capacity, so that the gap between actual and potential output is closed gradually. For the moment, I shall focus exclusively on the concerns about sustainability related to the balance between aggregate supply and demand. Later I shall turn to concerns about imbalances in equity prices, the personal saving rate, the current account, and the household debt burden.

A soft landing

Figure 1.A depicts a soft landing scenario, the graceful transition from the initial output gap to a sustainable growth path at full employment. A soft landing occurs if, as the level of output approaches potential, the growth of actual output slows to the growth of potential output just as actual output reaches potential. The line for potential output is, in effect, the runway. The line for actual output is like the path of a plane coming in for a soft landing - at least if you stand on your head while looking at the chart! Steady inflation is generally one of the signs of sustainability.
A simple model of inflation dynamics is that changes in inflation are induced by excess aggregate demand or supply, as reflected in the output gap. When actual output rises above potential, according to this model, the resulting excess aggregate demand leads to rising inflation. In the short run, supply shocks - including both relative price shocks and changes in trend productivity - can also affect inflation dynamics. But these effects are temporary. Ultimately, inflation dynamics will be driven by the output gap. At some point in any expansion, therefore, a soft landing is the preferred path to preserve a healthy expansion. Such a slowdown in growth is desirable because the alternative is higher inflation - indeed, continually rising inflation. Looking back, most recessions have resulted from attempts by the policy authorities - yes, the Fed - to reverse increases in inflation generated by overheating. A reverse soft landing Figure 1.B depicts an alternative scenario in which the above-trend growth in the expansion phase has moved the economy beyond the point of its sustainable capacity to produce. Policymakers in this case failed to execute a soft landing. What could they do in this case to best ensure the continuation of a healthy expansion? The answer is to engineer the closest possible approximation to a soft landing, one in which we glide to potential beginning from a position initially above rather than from below potential. Because the convergence to potential is from above rather than below, I call this a “reverse” soft landing. In this case, because the economy is already operating beyond its sustainable capacity, the economy may not be able to avoid some acceleration in inflation as policymakers try to engineer the soft landing. Therefore, the return to the potential output path has to be achieved in a sufficiently timely fashion to minimize any increase in inflation during the transition. Also, in the reverse soft landing case, growth must slow, not just to trend but to below trend in order to close the output gap. As a result, the unemployment rate must rise during the transition to full employment in this case. Hence, whereas the soft landing outcome in figure 1.A involves a stabilization of the unemployment rate at its low point and of inflation near its recent low, in the reverse soft landing case depicted in figure 1.B, both inflation and unemployment rates are likely to rise during the transition. The best-case scenario: supply meets demand The first two scenarios I have described assume that demand has to adjust to a steady supply path to achieve sustainability. There are natural equilibrating mechanisms, as well as policy adjustments, that encourage such adjustment in demand relative to supply. An alternative scenario is that supply adjusts to demand. This case is depicted in figure 1.C in which the growth of potential increases just as output threatens to push beyond potential. In this scenario, the runway has fortuitously landed on the plane, as it were. If this scenario describes the current episode, then the economy can continue to grow at 4%, the unemployment rate can remain near 4%, and inflation can remain steady at its prevailing rate. This scenario may seem farfetched. But it has the advantage of incorporating the role of supply-side as well as demand-side forces that appear to be at work in this expansion, and it would at least help explain the stability of inflation at prevailing growth and unemployment rates. 
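The inflation-dynamics story sketched above can be written compactly. What follows is a minimal accelerationist specification - the notation is chosen here for exposition, not taken from the speech:

$$
\pi_t = \pi_{t-1} + \alpha\,(y_t - y_t^{*}) + \varepsilon_t, \qquad \alpha > 0,
$$

where $\pi_t$ is inflation, $y_t - y_t^{*}$ is the output gap, and $\varepsilon_t$ collects the temporary supply shocks - relative-price movements and shifts in trend productivity - whose effects wash out over time. In this specification inflation is steady only when output sits at potential and the shocks are absent, which is why some landing is eventually required - unless, as in figure 1.C, potential itself rises to meet demand.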
Indeed, a possible decline in the NAIRU and, especially, an increase in the growth rate of potential output appear to be essential elements of this expansion. Still, even if we take into account the supply-side changes, we should not expect a perfect balancing between supply and demand. The worst-case scenario: hard landings The worst-case alternative to a soft landing, to continue the analogy, is often referred to as a hard landing. Despite, or perhaps because of, the recent exceptional performance of the US economy, some see a growing danger of a hard landing. A hard landing might seem like an oxymoron. A crash is a crash, after all, not a landing. But the key to a landing is that it is required to ensure sustainability of an expansion. A soft landing is the preferred course to ensuring a healthy, sustainable expansion, if it can be executed. A reverse soft landing is the second-best option. Otherwise, a hard landing may be unavoidable, despite the best efforts of policymakers. The upside of a hard landing is that it contributes to reversing imbalances and, afterwards, allows policymakers to aim once again at achieving a healthy, sustainable expansion, ultimately combining full employment, maximum sustainable growth, and price stability. It is useful to distinguish two broad classes of hard landings. The first involves the reversal of an imbalance between aggregate supply and aggregate demand. The classic example is the boom-bust scenario. The second class involves the unwinding of sector or market imbalances that either initiate a downturn in the economy or aggravate a downturn that would otherwise have occurred. A classic example of this genre is a stock market correction. I will focus first on the boom-bust scenario to complete my classification of paths to sustainable combinations of growth and the output gap. In the boom-bust scenario, depicted in figure 1.D, above-trend output growth during the expansion ultimately pushes output well beyond potential for a persistent period. The resulting overheating puts upward pressure on inflation. The monetary policy response to reverse the inflation often yields a decline in output, as depicted here, resulting in a period of economic slack and a reversal of the rise in inflation. This is the scenario from which we draw the lesson that timely, typically preemptive, policy restraint to avoid the excesses of a boom results in longer expansions and avoids unnecessary fluctuations in both output and inflation. Where are we relative to potential and do we need to land? To identify whether the initial conditions today correspond to those in one of the panels of figure 1, we have to assess where output is relative to its potential and whether growth is above or below trend. This assignment is more difficult than usual because structural changes of uncertain dimension may have raised both the level and the growth rate of potential output. There is, for example, a consensus that the NAIRU has declined since the early 1990s. That decline translates into an increase in the level of potential output at any given time or a decline in the output gap for a given level of output. There is also a consensus that the rate of growth of potential output is higher today than during the twenty years preceding this expansion. However, the degree to which these two parameters have changed is not a settled issue.
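The stakes in this measurement question are easy to illustrate with a back-of-the-envelope Okun's-law calculation. The coefficient of 2 is a conventional textbook value and the unemployment rate is rounded; both are assumptions for illustration, not numbers from the speech.

```python
# Output gap implied by alternative NAIRU estimates via Okun's law:
# each point of unemployment below the NAIRU maps to output roughly
# two percent above potential (coefficient assumed, not estimated).

OKUN_COEFFICIENT = 2.0
unemployment_rate = 4.1   # approximate US rate around early 2000

for nairu in (4.5, 5.0, 5.5):
    gap = OKUN_COEFFICIENT * (nairu - unemployment_rate)
    print(f"NAIRU {nairu:.1f}%: implied output gap {gap:+.1f}% of potential")
```

A half-point disagreement about the NAIRU thus becomes a full-point disagreement about how far output sits above potential, which is why the competing estimates matter so much for choosing among the scenarios in figure 1.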
In particular, some think that the rapid growth and low unemployment rate of the past two years represent the economy’s new equilibrium (case 1.C), but others believe that the economy has still been running ahead of potential recently, despite structural changes. We should not be surprised that a period of structural change would also be one of heightened uncertainty about key parameters, such as the NAIRU and trend growth. Table 1 offers various estimates of the NAIRU and trend growth, drawn from researchers, model-based forecasts, assumptions incorporated in government budget projections, and surveys of economic forecasters. These estimates suggest that actual output is above potential (the unemployment rate is below the NAIRU) and that actual output growth has been above trend growth of potential. Despite the uncertainties, the consensus estimates of the NAIRU and the growth of potential give us a hint about what type of landing we should be aiming for and which of the scenarios depicted in figure 1 best describes the economy’s initial conditions and prospects. The answer, it seems to me, is that no scenario in figure 1 does justice to the complex forces that have been in play during this expansion. I believe that the prevailing macro configuration is best described by some combination of figures 1.B, 1.C, or 1.D. That is, even after we incorporate the estimated decline in the NAIRU and the higher rate of growth of potential (as reflected in figure 1.C), output is above potential and output growth exceeds that of potential (as depicted in figures 1.B and 1.D). If output were above potential, we still would not know whether the outcome would be a soft (1.B) or a hard (1.D) landing. To complete the picture, we also have to rely on the temporary disinflationary effects of favorable relative-price shocks and the increase in the productivity trend that have allowed the economy to operate, for a while, beyond potential without suffering inflationary consequences. Hard landings associated with the unwinding of sector and market imbalances Much of the recent concern about the sustainability of this expansion is, nevertheless, not related directly to the balance between aggregate demand and aggregate supply. Rather, these concerns are related to perceived imbalances in particular sectors or markets. Most notably, attention has focused on equity prices, the personal saving rate, the current account deficit, and debt burdens. The unifying theme among this class of imbalances is that they typically arise during an expansion, often as the result of changing attitudes toward the perceived risk in the economy or as a result of increased willingness to accept risk. Many, though not all, of these imbalances are financial in nature - for example, increases in leverage or declines in liquidity and other margins of safety. These developments typically play a role in supporting or financing expansions. The resulting imbalances do not typically induce a downturn by spontaneously reversing. But they may act to magnify any downward forces that hit the economy, increasing the depth and perhaps duration of downturns. Hence, these factors play an important role in both phases of the boom-bust scenario. I associate many of the second class of hard landing scenarios with the work of a former colleague and friend, Hyman Minsky, who died in 1996. He emphasized the development of financial vulnerabilities in expansions and their contribution to serious recessions.
In his view, serious recessions are typically the result of a coincidence of adverse shocks on an already vulnerable economy. Minsky emphasized the role of vulnerabilities arising from financial imbalances, including excessive debt burdens or increases in the price of risky assets relative to safe assets.

Table 1: Estimates of NAIRU and Potential GDP Growth for 1999

Source | Date of estimate | NAIRU (percent) | Potential GDP growth (percent) | Comments
Congressional Budget Office | 6/99 | 5.6 | 3.1 (1) | NAIRU varies with demographics. Potential GDP growth varies year to year with capital deepening, etc. Updated estimates will be available January 26.
National Assoc. of Business Economists | 12/99 | 4.9 | 3.2 | Median of 35 forecasts. NAIRU estimates were collected in September 1998.
Macroeconomic Advisers | 1/00 | 5.5 | 3.5 | NAIRU varies with demographics. Potential GDP growth varies year to year with capital deepening, etc.
DRI | 1/00 | 5-1/4 | 3.2 | Long-run NAIRU concept. Short-run concept that includes supply shocks is 4-1/2 percent.
Robert Gordon | 1/00 | 5.1 | 3.5 | NAIRU varies according to Kalman-filter estimate from an equation based on the PCE deflator; equation is estimated through 1998, with some judgmental decline in the NAIRU since then.
Downtown Economists Club | 1/00 | 4.5 | 3.3 | Median of 15 forecasts.
1. The CBO estimate of GDP is on a pre-benchmark-revision basis. Given that the revisions raised GDP growth during the 1990s, this figure likely will be revised up on January 26.

Historical perspective on market or sector imbalances Figure 2 offers a historical perspective on equity prices, the personal saving rate, the current account, and debt burden. In each case, I identify a preferred measure of each variable, scaling it relative to an appropriate measure of output or income. I want to emphasize that we cannot reach a judgment from these charts about whether the perceived imbalance is real and serious, but we can at least understand why concerns have been raised in each case. The chart for the stock market, figure 2.A, shows the price-to-earnings ratio for the S&P500 index, based on the trailing four-quarter earnings. The current p/e ratio of about 32 compares with an average of 16 since 1957 and a high before this expansion of 22.3 in August 1987. The personal saving rate, charted in figure 2.B, has declined in this episode to a record low. The current account balance, pictured in figure 2.C, is measured as the ratio to nominal gross domestic product (GDP). This ratio has also declined to a record low. In figure 2.D, I have charted the ratio of debt service costs to disposable income for the household sector, a preferred measure of the household debt burden. It has been rising since the mid-1990s but remains below the peak reached in the mid-1980s. As I noted, none of these diagrams definitively demonstrate that there is an unsustainable imbalance. The point of the exercise is to show why some have worried that there might be. It would take a more detailed analysis than time permits to reach an informed judgment about the risks in each case. And when we were done with this more detailed analysis, we could reasonably expect that we would still be left with considerable uncertainty. Common sources of recent developments What I do want to focus on this afternoon are possible common sources for the developments pictured in figure 2 and the relation of any imbalance between aggregate supply and demand to those developments. There are, I believe, some common sources of the developments pictured in figure 2.
First, these variables are all cyclically sensitive. During expansions, equity prices tend to rise, although they often decline before a downturn in the economy. Discerning a consistent pattern for the saving rate during an expansion from the chart is more difficult: too many other factors play a role. But regression analysis indicates that the saving rate tends to move countercyclically. The current account balance tends to deteriorate if the expansion in the United States outpaces that abroad, as has been the case in recent years. After some point, the debt burden tends to increase sharply during expansions, although it often turns before a recession. But this cyclical expansion is not ordinary. It is exceptional. The unemployment rate, for example, has declined to a 30-year low. By some estimates, the output gap is the widest since the early-1970s. The duration of the expansion is about to set a record. It is therefore not surprising that cyclically sensitive variables are behaving exceptionally by historical standards. Second, the composition of output gains in this episode has also contributed to the patterns in figure 2. Private domestic demand typically is the driver of expansions, but its contribution has been even greater than usual this time. The direct contribution of federal government spending and tax changes, reflected in the swing in the federal budget from deficit to surplus, has been a net drag on growth. The weakness of our trading partners and the crises among emerging market economies contributed to the sharpness of the decline in net exports. So the pace of domestic private demand has been even stronger than the growth in overall output. Private domestic spending has been driven, in part, by the wealth effect arising from higher equity prices, a situation that also helps to explain much of the decline in the personal saving rate, and has been financed by higher household debt and by the tapping of foreign saving. Also, the two types of imbalances - an imbalance between aggregate demand and aggregate supply and sector or market imbalances - could be connected. Consider a situation in which growth is above trend and output moves beyond capacity. If investors misread these developments as sustainable and, therefore, extrapolate the exceptional conditions, then exceptional and perhaps unsustainable movements in equity prices, the saving rate, and the debt burden might be encouraged. Alternatively, a rise in equity prices that outstrips fundamentals might contribute to a pace of private domestic demand that ultimately takes output beyond capacity; in this case, the market or sector imbalance would be what contributed to the aggregate demand-supply imbalance. It also seems quite possible, indeed likely, that both these directions could operate simultaneously and reinforce each other. Finally, market or sector imbalances could possibly rise to worrisome proportions in the absence of an imbalance between aggregate demand and supply. This analysis leaves us with four possible combinations: (1) simultaneous imbalances in both aggregate demand/supply and market/sector variables; (2) simultaneous balance in each class; (3) aggregate demand/supply imbalance accompanied by balance in market/sector variables; and (4) aggregate demand/supply balance accompanied by market/sector imbalances. There are clearly a wide variety of opinions about which of these combinations best describes the current situation.
Indeed, many observers, including myself, are uncertain about which combination best fits the current picture. The problem in assessing the risks associated with market/sector imbalances is not only determining whether or not prevailing levels of these variables constitute an imbalance in the first place but also figuring out the circumstances and time frame over which any true imbalance might be unwound. In addition, the effects on the economy as a given imbalance is unwound will depend importantly on interactions with other imbalances and with events that trigger the unwinding of the imbalance, as well as on the policy response. For example, a stock market correction is typically triggered by some adverse event so that the effect on the economy will be the combined effects of the triggering event and the stock market decline. A decline in the stock market might also, for example, reduce confidence in the US economy and reduce the willingness of foreigners to accumulate the increment in US liabilities associated with the current account deficit. And the net effect will also depend on the policy response to the adverse effects of the unwinding of any imbalances. As a result of these considerations, simple multiplier exercises, such as the effect on the economy of a given percentage decline in equity prices, probably tell more about the econometric model used than about prospects for the economy. The challenge for monetary policy The most important of the perceived imbalances I have discussed today is, in my view, the possibility of an overheated economy. In three of the four combinations of aggregate demand/supply and market/sector imbalances, it seems to me that the best approach would be to focus directly on the aggregate demand/supply imbalance and allow the indirect effects of such a policy to mitigate any other imbalances. In addition, any other imbalances are more likely to grow to worrisome proportions during an unsustainable boom and are more likely to unwind in a disruptive manner if confronted by rising inflation, sharply higher interest rates in response to higher inflation, and a subsequent recession. As a result, my guess is that if we avoid the boom-bust scenario, we shall have avoided the most serious of the other imbalances or at least will be in a better position to absorb and respond to the unwinding of other possible imbalances. That leaves the possibility that there might be cases when we face market/sector imbalances in the absence of any aggregate demand/supply imbalance. In such a case, the level and growth of output are sustainable in the sense that they are not putting pressure on inflation; but this aggregate balance might be threatened subsequently by a spontaneous unwinding of a market/sector imbalance. Alternatively, the depth and duration of a downturn in response to some future adverse shock might be aggravated by the unwinding market/sector imbalances. What role can and should monetary policy play in such a case? Policymakers will, I expect, be reluctant to undermine macroeconomic performance in the short run in an attempt to unwind a perceived market/sector imbalance that might not be serious or might unwind in a gradual and nondisruptive fashion on its own. Furthermore, it is not obvious how to unwind an excessive debt burden, to raise the personal saving rate, or to narrow the current account deficit in a sustainable way through monetary policy. 
As a result, monetary policy, in my view, needs to focus on achieving balance between aggregate supply and aggregate demand. In pursuing this course, monetary policy is confronted by two competing challenges. The first is to allow the economy to realize the benefits of any decline in the NAIRU and any increase in trend growth. Supporting maximum sustainable growth is very much the business of monetary policy. But achieving maximum sustainable growth also is about ensuring the sustainability of an expansion and hence avoiding overheating. This is the second challenge today. I view the efforts of the FOMC as precisely focused on balancing these considerations.
Testimony of Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the Committee on Agriculture, Nutrition and Forestry, United States Senate, on 10 February 2000.
Mr Greenspan gives a testimony on over-the-counter derivatives Testimony of Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the Committee on Agriculture, Nutrition and Forestry, United States Senate, on 10 February 2000. * * * I am pleased to be here today to underscore the importance of this committee’s efforts to modernize the Commodity Exchange Act (CEA) and to express my support for the recommendations for amending the act that were contained in the report by the President’s Working Group on Financial Markets entitled Over-the-Counter Derivatives Markets and the Commodity Exchange Act. The need for legislation Over-the-counter (OTC) derivatives have come to play an exceptionally important role in our financial system and in our economy. These instruments allow users to unbundle risks and allocate them to the investors most willing and able to assume them. A growing number of financial and non-financial institutions have embraced derivatives as an integral part of their risk capital allocation and profit maximization. In particular, the profitability of derivative products has been a major factor in the significant gain in the finance industry’s share of American corporate output during the past decade - a reflection of their value to non-financial industry. Indeed, this value added from derivatives itself derives from their ability to enhance the process of wealth creation throughout our economy. In light of the importance of OTC derivatives, it is essential that we address the legal uncertainties created by the possibility that courts could construe OTC derivatives to be futures contracts subject to the CEA. The legal uncertainties create risks to counterparties in OTC contracts and, indeed, to our financial system that simply are unacceptable. They have also impeded initiatives to centralize the trading and clearing of OTC contracts, developments that have the potential to increase efficiency and reduce risks in OTC transactions. As I shall discuss more fully later in my remarks, rapid changes in communications technology portend that time is running out for us to modernize our regulation of financial markets before we lose them and the associated profits and employment opportunities to foreign jurisdictions that impose no such impediments. To be sure, the Congress and the Commodity Futures Trading Commission (CFTC) have taken steps to address these concerns about the CEA. The Futures Trading Practices Act of 1992 gave the CFTC authority to exempt OTC derivatives from most provisions of the act. In early-1993 the CFTC used that authority to create an exemption for OTC derivatives that reduced legal uncertainty for a wide range of transactions and counterparties. Unfortunately, some subsequent actions by the Commission called into question market participants’ understanding of the terms of the 1993 exemption. Now, under the leadership of Chairman Rainer, the Commission is considering reaffirming and expanding the terms of the 1993 exemption. Nonetheless, even with such an important and constructive step by the Commission, legislation amending the CEA would remain critically important. The greatest legal uncertainty affecting existing OTC transactions is in the area of securities-based contracts, where the CFTC’s exemptive authority is constrained. Furthermore, as events during the past few years have clearly demonstrated, regulatory exemptions, unlike statutory exclusions, carry the risk of amendment by future Commissions. 
Principles of regulation Imposing government regulation on a market can impair its efficiency. Thus, when evaluating the need for government regulation, one must clearly identify the public policy objectives of the regulation. As the working group’s report discusses, the primary public policy purposes of the CEA are to deter market manipulation and to protect investors against fraud and other unfair practices. We must of course assess whether government regulation is necessary to achieve those objectives. The regulatory framework of the CEA was designed for the trading of grain futures by the general public, including retail investors. Because quantities of grain following a harvest are generally known and limited, it is possible, at least in principle, to manipulate the price of grain by cornering a market. Furthermore, grain futures prices are widely disseminated and widely used as the basis for pricing grain transactions off the futures exchanges. The fact that grain futures serve such a price-discovery function means that if attempts to corner a market result in price fluctuations, the effects would be felt widely by producers and consumers of grain. OTC derivatives The President’s working group has considered whether regulation of OTC derivatives is necessary to achieve these public policy objectives of the CEA. In the case of financial OTC derivatives transactions between professional counterparties, the working group has agreed that such regulation is unnecessary and that such transactions should be excluded from coverage of the act. Importantly, the recommended exclusion would extend to those securities-based derivatives that currently are subject to the greatest legal risk from potential application of the CEA. The rationale for this position is straightforward. OTC transactions in financial derivatives are not susceptible to - that is, easily influenced by - manipulation. The vast majority of contracts are settled in cash, based on a rate or price determined in a separate highly liquid market with a very large or virtually unlimited deliverable supply. Furthermore, prices established in OTC transactions do not serve a price-discovery function. Thus, even if the price of an OTC contract were somehow manipulated, the adverse effects on the economy would be quite limited. With respect to fraud and other unfair practices, the professional counterparties that use OTC derivatives simply do not require the protections that CEA provides for retail investors. If professional counterparties are victimized, they can obtain redress under the laws applicable to contracts generally. The working group also considered whether the introduction of centralized mechanisms for the trading and settling of what heretofore have been purely bilaterally negotiated and settled transactions would give rise to a need for additional regulation. In the case of electronic trading systems, the working group concluded that regulation under the CEA was unnecessary and that such systems should be excluded from the act, provided that the contracts are not based on non-financial commodities with finite supplies and that the participants are limited to sophisticated counterparties trading solely for their own accounts. Electronic trading of such contracts by such counterparties, it was reasoned, would be no more susceptible to problems of manipulation and fraud than purely bilateral transactions. 
It was suggested that some limited regulation of such systems might become necessary in the future if such trading systems came to serve a price-discovery function. But it was agreed that creation of a regulatory system for such systems in anticipation of problems was inappropriate. As I have already noted, the vast majority of OTC derivatives simply are not susceptible to manipulation. Thus, even if those contracts come to play a role in price discovery, regulation of the trading mechanism might still be unnecessary. In the case of clearing systems for OTC derivatives, the working group concluded that government oversight is appropriate. Clearing tends to concentrate risks and responsibilities for risk management in a central party or clearinghouse. Consequently, the effectiveness of the clearinghouse’s risk management is critical for the stability of the markets that it serves. Depending on the types of transactions cleared, such oversight might appropriately be conducted by the CFTC under the CEA. Alternatively, it might be conducted by the Securities and Exchange Commission, the Federal Reserve, the Office of the Comptroller of the Currency, or a foreign financial regulator that one of the US regulators has determined satisfies appropriate standards. Provided such government oversight is in place, OTC transactions that would otherwise be excluded from the CEA should not fall within the ambit of the act because they are cleared. If market participants conclude that clearing would reduce counterparty risks in OTC transactions, concerns about legal risks associated with the potential application of the CEA should not stand in their way. Traditional exchanges The working group’s report does not make specific recommendations about the regulation of traditional exchange-traded futures markets that use open outcry trading or that allow trading by retail investors. Nevertheless, it calls for a review of the existing regulatory structures, particularly those applicable to financial futures, to ensure that they are appropriate in light of the objectives of the act. Consistent with the principles of regulation that I identified earlier, the report notes that exchange-traded futures should not be subject to regulations that are unnecessary to achieve the CEA’s objectives. The report also concludes that the current prohibition on single-stock futures can be repealed if issues about the integrity of the underlying securities market and regulatory arbitrage are resolved. I want to underscore how important it is for us to address these issues promptly. I cannot claim to speak with certainty as to how our complex and rapidly moving markets will evolve. But I see a real risk that, if we fail to rationalize our regulation of centralized trading mechanisms for financial instruments, these markets and the related profits and employment opportunities will be lost to foreign jurisdictions that maintain the confidence of global investors without imposing so many regulatory constraints. My concerns on this score stem from the dramatic advances in information technology that we see all around us. In markets with significant economies of scale and scope, like those for standardized financial instruments, there is a tendency toward consolidation or even natural monopoly. Throughout much of our history this tendency has been restrained by an inability to communicate information sufficiently quickly, cheaply, and accurately. 
In recent years, however, this constraint is being essentially eliminated by advances in telecommunications. We have not yet seen clear evidence of a trend toward natural monopoly. But the diffusion of technology often traces an S-shaped curve, first diffusing slowly, but then rapidly picking up speed. Once we reach the steep segment of that S-curve, it may be too late to rationalize our regulatory structure. Already the largest futures exchange in the world is no longer in the American heartland; instead, it is now in the heart of Europe. To be sure, no US exchange has yet lost a major contract to a foreign competitor. But it would be a serious mistake for us to wait for such unmistakable evidence of a loss of international competitiveness before acting. As our experience with the vast eurodollar markets demonstrates, once markets with scale and scope economies are lost, they are very difficult, if not impossible, to recapture.
Testimony of Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the Committee on Banking and Financial Services, US House of Representatives on 17 February 2000.
Mr Greenspan presents the Federal Reserve’s semi-annual report on the economy and monetary policy to the US House of Representatives Testimony of Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the Committee on Banking and Financial Services, US House of Representatives on 17 February 2000. * * * I appreciate this opportunity to present the Federal Reserve’s semi-annual report on the economy and monetary policy. There is little evidence that the American economy, which grew more than 4% in 1999 and surged forward at an even faster pace in the second half of the year, is slowing appreciably. At the same time, inflation has remained largely contained. An increase in the overall rate of inflation in 1999 was mainly a result of higher energy prices. Importantly, unit labor costs actually declined in the second half of the year. Indeed, still-preliminary data indicate that total unit cost increases last year remained extraordinarily low, even as the business expansion approached a record nine years. Domestic operating profit margins, after sagging for eighteen months, apparently turned up again in the fourth quarter, and profit expectations for major corporations for the first quarter have been undergoing upward revisions since the beginning of the year - scarcely an indication of imminent economic weakness. The economic forces at work Underlying this performance, unprecedented in my half-century of observing the American economy, is a continuing acceleration in productivity. Non-farm business output per workhour increased 3¼% during the past year - likely more than 4% when measured by non-farm business income. Security analysts’ projections of long-term earnings, an indicator of expectations of company productivity, continued to be revised upward in January, extending a string of upward revisions that began in early 1995. One result of this remarkable economic performance has been a pronounced increase in living standards for the majority of Americans. Another has been a labor market that has provided job opportunities for large numbers of people previously struggling to get on the first rung of a ladder leading to training, skills, and permanent employment. Yet those profoundly beneficial forces driving the American economy to competitive excellence are also engendering a set of imbalances that, unless contained, threaten our continuing prosperity. Accelerating productivity entails a matching acceleration in the potential output of goods and services and a corresponding rise in real incomes available to purchase the new output. The problem is that the pickup in productivity tends to create even greater increases in aggregate demand than in potential aggregate supply. This occurs principally because a rise in structural productivity growth has its counterpart in higher expectations for long-term corporate earnings. This, in turn, not only spurs business investment but also increases stock prices and the market value of assets held by households, creating additional purchasing power for which no additional goods or services have yet been produced. Historical evidence suggests that perhaps three to four cents out of every additional dollar of stock market wealth eventually is reflected in increased consumer purchases. The sharp rise in the amount of consumer outlays relative to disposable incomes in recent years, and the corresponding fall in the saving rate, has been consistent with this so-called wealth effect on household purchases. 
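The scale of that wealth effect is easy to sketch. The figures below are hypothetical - a round $1 trillion wealth gain chosen only to show orders of magnitude, not a measured change in equity wealth:

```python
# Hypothetical wealth-effect arithmetic: three to four cents of added
# consumer spending per dollar of added stock market wealth, per the
# historical evidence cited in the text. The wealth gain is assumed.

cents_per_dollar = (0.03, 0.04)    # three to four cents
wealth_gain = 1.0e12               # assumed $1 trillion rise in equity wealth

low, high = [m * wealth_gain for m in cents_per_dollar]
print(f"implied extra consumer spending: ${low/1e9:.0f}-{high/1e9:.0f} billion")
```

Against a nominal GDP of roughly $9 trillion in 1999, an extra $30 billion to $40 billion of spending amounts to roughly one-third to one-half of a percentage point of demand - small per dollar of wealth, but large in the aggregate.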
Moreover, higher stock prices, by lowering the cost of equity capital, have helped to support the boom in capital spending. Outlays prompted by capital gains in excess of increases in income, as best we can judge, have added about 1 percentage point to annual growth of gross domestic purchases, on average, over the past five years. The additional growth in spending of recent years that has accompanied these wealth gains as well as other supporting influences on the economy appears to have been met in about equal measure from increased net imports and from goods and services produced by the net increase in newly hired workers over and above the normal growth of the work force, including a substantial net inflow of workers from abroad. But these safety valves that have been supplying goods and services to meet the recent increments to purchasing power largely generated by capital gains cannot be expected to absorb an excess of demand over supply indefinitely. First, growing net imports and a widening current account deficit require ever larger portfolio and direct foreign investments in the United States, an outcome that cannot continue without limit. Imbalances in the labor markets perhaps may have even more serious implications for inflation pressures. While the pool of officially unemployed and those otherwise willing to work may continue to shrink, as it has persistently over the past seven years, there is an effective limit to new hiring, unless immigration is uncapped. At some point in the continuous reduction in the number of available workers willing to take jobs, short of the repeal of the law of supply and demand, wage increases must rise above even impressive gains in productivity. This would intensify inflationary pressures or squeeze profit margins, with either outcome capable of bringing our growing prosperity to an end. As would be expected, imbalances between demand and potential supply in markets for goods and services are being mirrored in the financial markets by an excess in the demand for funds. As a consequence, market interest rates are already moving in the direction of containing the excess of demand in financial markets and therefore in product markets as well. For example, BBB corporate bond rates adjusted for inflation expectations have risen by more than 1 percentage point during the past two years. However, to date, rising business earnings expectations and declining compensation for risk have more than offset the effects of this increase, propelling equity prices and the wealth effect higher. Should this process continue, however, with the assistance of a monetary policy vigilant against emerging macroeconomic imbalances, real long-term rates will at some point be high enough to finally balance demand with supply at the economy’s potential in both the financial and product markets. Other things equal, this condition will involve equity discount factors high enough to bring the rise in asset values into line with that of household incomes, thereby stemming the impetus to consumption relative to income that has come from rising wealth. This does not necessarily imply a decline in asset values - although that, of course, can happen at any time for any number of reasons - but rather that these values will increase no faster than household incomes.
Because there are limits to the amount of goods and services that can be supplied from increasing net imports and by drawing on a limited pool of persons willing to work, it necessarily follows that consumption cannot keep rising faster than income. Moreover, outsized increases in wealth cannot persist indefinitely either. For so long as the levels of consumption and investment are sensitive to asset values, equity values increasing at a pace faster than income, other things equal, will induce a rise in overall demand in excess of potential supply. But that situation cannot persist without limit because the supply safety valves are themselves limited. With foreign economies strengthening and labor markets already tight, how the current wealth effect is finally contained will determine whether the extraordinary expansion that it has helped foster can slow to a sustainable pace, without destabilizing the economy in the process. Technological change continues apace On a broader front, there are few signs to date of slowing in the pace of innovation and the spread of our newer technologies that, as I have indicated in previous testimonies, have been at the root of our extraordinary productivity improvement. Indeed, some analysts conjecture that we still may be in the earlier stages of the rapid adoption of new technologies and not yet in sight of the stage when this wave of innovation will crest. With so few examples in our history, there is very little basis for determining the particular stage of development through which we are currently passing. Without doubt, the synergies of the microprocessor, laser, fiber-optic glass, and satellite technologies have brought quantum advances in information availability. These advances, in turn, have dramatically decreased business operational uncertainties and risk premiums and, thereby, have engendered major cost reductions and productivity advances. There seems little question that further major advances lie ahead. What is uncertain is the future pace of the application of these innovations, because it is this pace that governs the rate of change in productivity and economic potential. Monetary policy, of course, did not produce the intellectual insights behind the technological advances that have been responsible for the recent phenomenal reshaping of our economic landscape. It has, however, been instrumental, we trust, in establishing a stable financial and economic environment with low inflation that is conducive to the investments that have exploited these innovative technologies. Federal budget policy has also played a pivotal role. The emergence of surpluses in the unified budget and of the associated increase in government saving over the past few years has been exceptionally important to the balance of the expansion, because the surpluses have been absorbing a portion of the potential excess of demand over sustainable supply associated partly with the wealth effect. Moreover, because the surpluses are augmenting the pool of domestic saving, they have held interest rates below the levels that otherwise would have been needed to achieve financial and economic balance during this period of exceptional economic growth. They have, in effect, helped to finance and sustain the productive private investment that has been key to capturing the benefits of the newer technologies that, in turn, have boosted the long-term growth potential of the US economy. 
The recent good news on the budget suggests that our longer-run prospects for continuing this beneficial process of recycling savings from the public to the private sectors have improved greatly in recent years. Nonetheless, budget outlays are expected to come under mounting pressure as the baby boom generation moves into retirement, a process that gets under way a decade from now. Maintaining the surpluses and using them to repay debt over coming years will continue to be an important way the federal government can encourage productivity-enhancing investment and rising standards of living. Thus, we cannot afford to be lulled into letting down our guard on budgetary matters, an issue to which I shall return later in this testimony. The economic outlook Although the outlook is clouded by a number of uncertainties, the central tendencies of the projections of the Board members and Reserve Bank presidents imply continued good economic performance in the United States. Most of them expect economic growth to slow somewhat this year, easing into the 3½-3¾% area. The unemployment rate would remain in the neighborhood of 4-4¼%. The rate of inflation for total personal consumption expenditures is expected to be 1¾-2%, at or a bit below the rate in 1999, which was elevated by rising energy prices. In preparing these forecasts, the Federal Open Market Committee members had to consider several of the crucial demand- and supply-side forces I referred to earlier. Continued favorable developments in labor productivity are anticipated both to raise the economy’s capacity to produce and, through its supporting effects on real incomes and asset values, to boost private domestic demand. When productivity-driven wealth increases were spurring demand a few years ago, the effects on resource utilization and inflation pressures were offset in part by the effects of weakening foreign economies and a rising foreign exchange value of the dollar, which depressed exports and encouraged imports. Last year, with the welcome recovery of foreign economies and with the leveling out of the dollar, these factors holding down demand and prices in the United States started to unwind. Strong growth in foreign economic activity is expected to continue this year, and, other things equal, the effect of the previous appreciation of the dollar should wane, augmenting demand on US resources and lessening one source of downward pressure on our prices. As a consequence, the necessary alignment of the growth of aggregate demand with the growth of potential aggregate supply may well depend on restraint on domestic demand, which continues to be buoyed by the lagged effects of increases in stock market valuations. Accordingly, the appreciable increases in both nominal and real intermediate- and long-term interest rates over the last two years should act as a needed restraining influence in the period ahead. However, to date, interest-sensitive spending has remained robust, and the FOMC will have to stay alert for signs that real interest rates have not yet risen enough to bring the growth of demand into line with that of potential supply, even should the acceleration of productivity continue. Achieving that alignment seems more pressing today than it did earlier, before the effects of imbalances began to cumulate, lessening the depth of our various buffers against inflationary pressures. Labor markets, for example, have tightened in recent years as demand has persistently outstripped even accelerating potential supply. 
As I have previously noted, we cannot be sure in an environment with so little historical precedent what degree of labor market tautness could begin to push unit costs and prices up more rapidly. We know, however, that there is a limit, and we can be sure that the smaller the pool of people without jobs willing to take them, the closer we are to that limit. As the FOMC indicated after its last meeting, the risks still seem to be weighted on the side of building inflation pressures. A central bank can best contribute to economic growth and rising standards of living by fostering a financial environment that promotes overall balance in the economy and price stability. Maintaining an environment of effective price stability is essential, because the experience in the United States and abroad has underscored that low and stable inflation is a prerequisite for healthy, balanced, economic expansion. Sustained expansion and price stability provide a backdrop against which workers and businesses can respond to signals from the marketplace in ways that make most efficient use of the evolving technologies. Federal budget policy issues Before closing, I should like to revisit some issues of federal budget policy that I have addressed in previous congressional testimony. Some modest erosion in fiscal discipline resulted last year through the use of the “emergency” spending initiatives and some “creative accounting”. Although somewhat disappointing, that erosion was small relative to the influence of the wise choice of the Administration and the Congress to allow the bulk of the unified budget surpluses projected for the next several years to build and retire debt to the public. The idea that we should stop borrowing from the social security trust fund to finance other outlays has gained surprising - and welcome - traction, and it establishes, in effect, a new budgetary framework that is centered on the on-budget surplus and how it should be used. This new framework is useful because it offers a clear objective that should strengthen budgetary discipline. It moves the budget process closer to accrual accounting, the private-sector norm, and - I would hope - the ultimate objective of federal budget accounting. The new budget projections from the Congressional Budget Office and the Administration generally look reasonable. But, as many analysts have stressed, these estimates represent a midrange of possible outcomes for the economy and the budget, and actual budgetary results could deviate quite significantly from current expectations. Some of the uncertainty centers on the likelihood that the recent spectacular growth of labor productivity will persist over the years ahead. Like many private forecasters, the CBO and the Office of Management and Budget assume that productivity growth will drop back somewhat from the recent stepped-up pace. But a distinct possibility, as I pointed out earlier, is that the development and diffusion of new technologies in the current wave of innovation may still be at a relatively early stage and that the scope for further acceleration of productivity is thus greater than is embodied in these budget projections. If so, the outlook for budget surpluses would be even brighter than is now anticipated. But there are significant downside risks to the budget outlook as well. One is our limited knowledge of the forces driving the surge in tax revenues in recent years. 
Of course, a good part of that surge is due to the extraordinary rise in the market value of assets which, as I noted earlier, cannot be sustained at the pace of recent years. But that is not the entire story. These relationships are complex, and until we have detailed tabulations compiled from actual tax returns, we shall not really know why individual tax revenues, relative to income, have been even higher than would have been predicted from rising asset values and bracket creep. Thus, we cannot rule out the possibility that this so-called “tax surprise”, which has figured so prominently in the improved budget picture of recent years, will dissipate or reverse. If this were to happen, the projected surpluses, even with current economic assumptions, would shrink appreciably and perhaps disappear. Such an outcome would be especially likely if adverse developments occurred in other parts of the budget as well - for example, if the recent slowdown in health care spending were to be followed by a sharper pickup than is assumed in current budget projections. Another consideration that argues for letting the unified surpluses build is that the budget is still significantly short of balance when measured on an accrual basis. If social security, for example, were measured on such a basis, counting benefits when they are earned by workers rather than when they are paid out, that program would have shown a substantial deficit last year. The deficit would have been large enough to push the total federal budget into the red, and an accrual-based budget measure could conceivably record noticeable deficits over the next few years, rather than the surpluses now indicated by the official projections for either the total unified budget or the on-budget accounts. Such accruals take account of still growing contingent liabilities that, under most reasonable sets of actuarial assumptions, currently amount to many trillions of dollars for social security benefits alone. Even if accrual accounting is set aside, it might still be prudent to eschew new longer-term, potentially irreversible commitments until we are assured that the on-budget surplus projections are less conjectural than they are, of necessity, today. Allowing surpluses to reduce the debt to the public, rather than for all practical purposes irrevocably committing to their disposition in advance, can be viewed as a holding action pending the clarification of the true underlying budget outcomes of the next few years. Debt repaid can very readily be reborrowed to fund delayed initiatives. More fundamentally, the growth potential of our economy under current circumstances is best served, in my judgment, by allowing the unified budget surpluses presently in train to materialize and thereby reduce Treasury debt held by the public. Yet I recognize that growing budget surpluses may be politically infeasible to defend. If this proves to be the case, as I have also testified previously, the likelihood of maintaining a still satisfactory overall budget position over the longer run is greater, I believe, if surpluses are used to lower tax rates rather than to embark on new spending programs. History illustrates the difficulties of keeping spending in check, especially in programs that are open-ended commitments, which too often have led to larger outlays than initially envisioned. Decisions to reduce taxes, however, are more likely to be contained by the need to maintain an adequate revenue base to finance necessary government services. 
Moreover, especially if designed to lower marginal rates, tax reductions can offer favorable incentives for economic performance. Conclusion As the US economy enters a new century as well as a new year, the time is opportune to reflect on the basic characteristics of our economic system that have brought about our success in recent years. Competitive and open markets, the rule of law, fiscal discipline, and a culture of enterprise and entrepreneurship should continue to undergird rapid innovation and enhanced productivity that in turn should foster a sustained further rise in living standards. It would be imprudent, however, to presume that the business cycle has been purged from market economies so long as human expectations are subject to bouts of euphoria and disillusionment. We can only anticipate that we will readily take such diversions in stride and trust that beneficent fundamentals will provide the framework for continued economic progress well into the new millennium.
Remarks by Mr Roger W Ferguson Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, before the Downtown Economists Club, New York, on 17 February 2000.
Mr Ferguson reviews last year’s economic performance in the United States and raises some topics related to the underpinnings of macroeconomics and monetary policy Remarks by Mr Roger W Ferguson Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, before the Downtown Economists Club, New York, on 17 February 2000. * * * Thank you very much for giving me the opportunity to join this illustrious group of economists in this historic setting. The end of one year and start of another is a natural time to focus on accomplishments and also to list the things to do or to understand better. In this spirit, I would like to review last year’s economic performance and then raise some topics related to the underpinnings of macroeconomics and monetary policy. Of course, the views that I am about to express are my own and do not necessarily reflect those of other members of the FOMC or the Board of Governors. Economic performance in 1999 Last year presented various challenges for monetary policymakers. Domestic demand was particularly strong, led by consumption expenditures, which grew 5½% last year. However, consumption was only one engine driving the spectacular performance of 1999. Business fixed investment, paced by spending on producers’ durable equipment, also rose strongly - by 7%. One investment sector, housing, which had been a source of considerable strength in 1998, grew less rapidly last year. Although single-family starts rose further in 1999, starts of multifamily units were off from their 1998 pace. Long-term interest rates trended downward in early 1999 but more than retraced those declines by the end of the year. Equity prices rose over much of 1999, raising the value of household assets and improving the general sense of financial well-being of our citizens as well as lowering the cost of capital faced by businesses. Recovery in Asia, with the possible exception of Japan, and a pickup in growth in Europe accompanied this good news in the United States. During the first half of 1998, net exports subtracted almost 1¾ percentage points from GDP growth. With conditions improving overseas, the external sector subtracted less than ¾ percentage point from growth in the second half of 1999. Throughout 1999, the Federal Open Market Committee took action to maintain balance in the economy. The sense of the FOMC was that the tightening of labor markets that accompanied a growth of demand exceeding even the stepped-up pace of supply growth would likely create upward pressures on labor costs and eventually on the rate of price inflation. Accordingly, we moved preemptively, raising rates 75 basis points in three increments. This tightening reversed the easings that had been put in place during the second half of 1998 in response to the market turmoil, including in the US markets, triggered by the unexpected events in Russia. As you well know, rates were raised another 25 basis points at our meeting earlier this month. At the February meeting, the FOMC also abandoned the approach of discussing “biases” regarding interest rate movements, which seemed to engender overly strong reactions or a misreading of our intentions from markets. Instead, the FOMC adopted the approach of discussing our views on economic developments and potential risks to good economic performance. Using this new approach, the FOMC earlier this month indicated a concern that risks were weighted mainly toward conditions that might lead to increased inflation.
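The growth-accounting arithmetic behind contribution figures like those just cited is straightforward: a spending component adds roughly its share of GDP times its growth rate, while net exports add the change in the trade balance relative to GDP. In the sketch below, the shares, the GDP level, and the net-export change are rough, assumed values - only the 5½% and 7% growth rates come from the speech:

```python
# Stylized expenditure contributions to real GDP growth.
# Shares, GDP level, and the net-export change are assumed values.

def contribution(share_of_gdp, growth_pct):
    """Contribution to GDP growth, in percentage points."""
    return share_of_gdp * growth_pct

print(f"consumption (68% share, 5.5% growth): {contribution(0.68, 5.5):+.1f} pp")
print(f"equipment investment (9% share, 7.0% growth): {contribution(0.09, 7.0):+.1f} pp")

# Net exports: a $70 billion deterioration in the trade balance set
# against a roughly $9,000 billion economy subtracts about 0.8 point.
delta_net_exports, gdp = -70.0, 9000.0   # $ billions, assumed
print(f"net exports: {100.0 * delta_net_exports / gdp:+.1f} pp")
```

The point of the arithmetic is simply that consumption, with its dominant share, drives the totals, while a swing in net exports of well under 1 percent of GDP can still add or subtract the better part of a percentage point of growth.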
Sources of performance

Four major forces provide the underpinning for the vigorous domestic growth we are currently experiencing. The first is the creation of, and massive investment in, information and communications technology. This capital spending, along with the apparent technological improvements, is thought to have been important in the increase in productivity - the output of goods and services per hour of work - that is currently providing such momentum for the economy of the United States.

The second major force is business deregulation. The removal of unnecessary government regulation started more than 20 years ago, during the administration of President Gerald Ford, and gathered momentum during the Carter years. It has altered the business landscape. Deregulation allowed, indeed forced, businesses to focus more clearly on a marketplace that has become more competitive, with fewer constraints and increased flexibility.

The third major force is a more prudent fiscal policy. The 1990s were characterized by a movement of federal government balances toward and then into surplus, which, many believe, has freed up resources for private-sector investment.

The final major force is the reduction of both actual and expected inflation. Relatively stable prices have allowed businesses and households to plan their economic affairs with an expectation that the value of investments will not be eroded through a pernicious and uncertain increase in the general price level. Indeed, relative price level stability has reinforced the impetus provided by deregulation for businesses to manage their affairs with a priority on efficiency.

Observations and open issues: resource utilization

Against this backdrop, I want to raise some issues where further progress is needed if we are to understand recent macroeconomic events better. Some of these issues have been of concern for some time, and a few are new. The first issue I wish to raise involves the supply side of the economy and grows out of the recent, unusual conjuncture of rapid growth and high resource utilization with low and stable inflation. What is the proper measurement of resource tightness? The two most prominent measures of resource tightness, capacity utilization in manufacturing and the rate of unemployment, have historically moved fairly closely together over the cycle. However, they have diverged in the past several years, in part as the surge in investment has provided considerable capacity to the manufacturing sector while labor markets have become tighter and tighter. We need a better understanding of the implications of this divergence and, in particular, a clearer sense of which measure, or combination of measures, of resource utilization best foreshadows the emergence of price pressures.

A second, and related, issue is whether the nature of “capacity” has changed. In the 1940s and 1950s, large-scale units of fixed machinery - such as blast furnaces and assembly lines - typically characterized manufacturing capacity. Many observers have argued that, because this capacity required long lead times to manufacture, test and install, available capacity was easier to measure and slower to change. In such circumstances, high levels of capacity utilization were good predictors of resource tightness, which was likely to translate into pricing pressure. Now, we hear, capacity in manufacturing is more technology intensive and can be adjusted more easily to reflect supply and demand conditions.
If true, this relatively “elastic” supply of manufacturing capacity would imply that capacity utilization may not become “tight” by historical standards, and our measure of capacity utilization would therefore be a less certain early warning signal of potential pricing pressure. I have seen no proof of the assertion that the nature of manufacturing capacity has changed, although the experience of the last several years suggests that it has.

A third related issue has to do with the average workweek and labor force participation. During this episode of strong growth, the average workweek has not increased significantly. In 1994, the average workweek was about 34½ hours, and today it is about 34½ hours. This steadiness is in part due to mix shifts, as the economy moves away from manufacturing sector jobs, in which the 40-hour workweek is the norm, to service sector jobs, in which shorter workweeks are more common. Moreover, the labor force participation rate, while fluctuating, has remained around 67% during this period. Given the various other elements of evidence regarding labor market tightness, including survey data on job market conditions and the measured unemployment rate, I find it puzzling that both the workweek and the labor force participation rate have not increased more strongly. One theory, of course, is that household wealth, to which I turn next, might limit the felt need by some potential workers, presumably young people dependent on parents and perhaps older citizens, to participate in the labor market.

Observations and open issues: equity markets

Also of interest are valuations in equity markets, the role they should play in policymaking, and whether old relationships have changed. Many observers have asked if I think that the Federal Reserve can or should have a fixed view on the proper level of equity markets. Every time I consider this question, I come up with the same answer: the Fed cannot target specific levels in equity markets. Equity prices are set by the give-and-take of supply and demand, with participants buying and selling based on their own information that shapes their long-term expectations. Investors can and should be influenced by several factors, including expectations of corporate earnings, attractiveness of alternative investments (both domestic and international), and differing appetites for “ownership” risk as opposed to “creditor” risk.

I believe that the Federal Reserve’s tools - primarily short-term interest rates - are too blunt to attempt to achieve specific levels of stock market valuations. I also believe that policymakers should not necessarily attempt to put their judgments of correct values above those of the market. Simply put, equity prices should properly be thought of as a relative price - the value of the existing capital stock relative to that of goods. Central banks are not good at fine-tuning relative prices. Rather, leave us the responsibility for determining the policy that anchors the general price level in the long run.

However, equity markets do have important spillover effects on the real economy. As you know, economists often speak of the “wealth effect”, and econometric modeling indicates that consumers ultimately tend to spend about three to four cents for every dollar increment to wealth. In addition, consumer sentiment is tied to feelings of financial well-being.
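To put a rough number on the first of those channels, here is a back-of-the-envelope sketch; only the three-to-four-cent propensity comes from the remarks above, while the $1 trillion wealth increment is purely illustrative:

```python
# Illustrative only: the 3-4 cent marginal propensity to consume out of
# wealth cited above, applied to a hypothetical $1 trillion wealth gain.
mpc_wealth_low, mpc_wealth_high = 0.03, 0.04
wealth_gain = 1_000_000_000_000  # hypothetical $1 trillion increment

extra_low = mpc_wealth_low * wealth_gain
extra_high = mpc_wealth_high * wealth_gain
print(f"Added annual consumption: ${extra_low/1e9:.0f}-{extra_high/1e9:.0f} billion")
# -> Added annual consumption: $30-40 billion - a noticeable boost to
#    demand in an economy of roughly $9 trillion.
```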
Through both of these channels, the so-called wealth effect and the more general influence on consumer sentiment, equity valuations can and do affect consumption and macroeconomic performance. Equity markets are also a significant source of investment capital, and valuations in the stock market are one determinant of the cost of capital for businesses. Therefore, equity prices affect business fixed investment, a major driver of our economy. Given the economic importance of equity prices, it is reasonable for policymakers to monitor developments in this market even if we do not “target” specific values. We have seen in other economies that the bursting of bubbles in financial markets can create unsettled conditions that affect real economic activity. Therefore, the maintenance of sound equity market conditions is of concern to policymakers, though how that can be accomplished is often far from clear.

My questions about equity markets center on the issue of valuation and the wealth effect. Economists propose numerous approaches to determining the “correct” level of equity prices. One such approach compares equity market valuations (namely, earnings-price ratios) to the return on fixed-income securities, generally the 10-year US Treasury bond. But many observers have suggested that this measure of the “correct” stock market valuation may no longer be accurate. Some suggest that the nature of equity markets has changed with the introduction of new instruments that allow for the better management or sharing of risks. Therefore, these observers assert that lower premiums over risk-free returns are appropriate and that old relationships between earnings-price ratios and the return on Treasury instruments no longer hold. Others argue that, in this world of knowledge-intensive industries, accounting treatments do not accurately measure true economic earnings, and therefore measures of “correct” stock valuations do not capture the economic reality that market participants see. These assertions are interesting, but they need further investigation.

In addition, with respect to proper valuations, we know that many businesses are using options as a form of compensation to employees and that the value of these options is not being recorded as compensation at the time they are granted. We roughly estimate that accounting for the value of options granted would have reduced reported income for S&P 1,500 firms nearly 10% in 1998. The same adjustment would have reduced the growth rate of reported income for S&P 1,500 firms almost 2½ percentage points per year, on average, during the 1996-98 period. When looked at with these refinements, current earnings-price ratios appear even more out of alignment with historical experience.

When one considers the performance of the stock market during the recent past, it is clear that the gains are not evenly distributed, even among stock market investors. Some of the greatest beneficiaries of the unprecedented generation of stock market wealth have been those with the skills and the work style to work in high-tech companies, who are rewarded with stock, and those with the courage to invest in high-tech sectors. This observation leads to several questions. First, how skewed is the distribution of gains from the stock market? If gains are indeed skewed, what then is the actual dynamic of the wealth effect? When does an unequal participation in equities become sufficiently broad-based to influence the path of our economy?
I think of these questions as providing the microeconomic underpinnings to our macroeconomic performance.

Finally, we have seen a run-up in margin debt, particularly during the last two months of 1999 and the first month of 2000. I believe that the Federal Reserve should not foster the impression that we are targeting the equity market by adjusting our one tool in the margin area, namely initial margin requirements. However, given our obvious interest in macro-stability, it is useful to understand more fully what has motivated this recent run-up in margin. Some argue that it reflects a desire on the part of investors to capture some capital gains, while others are quick to point out that, as a percentage of market valuation, margin has not increased dramatically. According to another theory, margin borrowing is the realm of the small investor, whereas large investors finance equity purchases through other means. In any event, prudent margin procedures are an important part of sound business practice. I expect that those extending margin credit, especially the major clearing firms, as well as investors and the public at large, will continue to recognize that conservative margin practices are in their own interest.

Observations and open issues: international markets

Let me turn now to the role of international developments in policy. Our mandate gives priority to price stability and maximum sustainable employment, which I think are the right elements for us to consider in policy deliberations. Therefore, I believe that international economic considerations, like stock market valuations, should receive only the focus merited by their implications for the US economy. Certainly, developments in the international sector, in particular a large and growing current account deficit, might indicate that there are imbalances in the economy of the United States. Similarly, movements in the exchange value of the US dollar might transmit pressures on inflation, but they also are an important transmission mechanism for monetary policy. However, managing the external balance and the exchange value of the US dollar is obviously not the goal of monetary policy.

One important contribution to the ongoing deterioration of our current account balance is the tendency for US residents, for reasons not fully understood by economists, to have a higher propensity to import out of every dollar of income than do residents in the countries that are our major trading partners. This, of course, means that even sustaining trend growth both in the United States and in our trading partners will not necessarily close the trade gap. To do so would require a period of sustained stronger-than-trend growth overseas.

Many professional economists and market observers now question the sustainability of our current account deficit. I suggest that the combination of a large current account deficit and a strong US dollar is, in part, a reflection of the relative attractiveness of the US economy as a destination for foreign capital. We are attracting savings from abroad because we currently have a higher rate of return on investment than many other countries do. This state of affairs, of course, is a reflection of the factors that have given rise to this long period of domestic prosperity. How much longer prosperity can continue is obviously the key question. The answer depends on the ability of businesses and workers in the United States to continue to generate growing productivity increases.
Because few of us forecast the current period of investment-driven productivity increases, it is difficult to predict when it will end. We know only that, at some point, productivity will stop accelerating. But the potential for change is not the exclusive domain of the United States. The ability of other countries to adjust their systems of production to take advantage of new technologies will be an important determinant of when other markets will offer returns that are comparable to those available in the United States. I imagine that, seeing the gains that we have experienced from our capital deepening, investors would look for the earliest signs of a similar phenomenon in other countries. The one lesson we can offer is that the configuration of competitive and flexible markets, management focused on shareholder value, and supportive macroeconomic conditions is not achieved without costs in terms of some societal dislocations, and that the configuration requires good fortune, sacrifice and discipline.

Conclusion

As you can see, these are interesting times in which to be a central banker. This complex set of forces requires continued vigilance on our part. Additionally, the last 24 months have raised an important set of questions regarding measures of real economic performance, the behavior of inflation, financial market indicators, and the growing globalization of today’s economy. I have highlighted some of these questions for you today. I have enjoyed immensely having to grapple with these issues but recognize that, while we at the central bank may have all of the right questions, we do not have all the answers.
Speech by Mr Laurence H Meyer, Member of the Board of Governors of the US Federal Reserve System, before the 16th Annual Policy Conference, National Association for Business Economics, Washington, D.C. on 23 February 2000.
Mr Meyer answers the question “How does a surplus affect the formulation and conduct of monetary policy?”

* * *

This is a conference on fiscal policy, specifically on how to allocate any potential future surpluses among debt reduction, tax cuts, and spending increases. The question you and I might be asking right now is: what is a monetary policymaker doing here? You know, of course, that I always enjoy visiting with my fellow NABE members. But what contribution can I make to this important topic? My assignment is to answer this question: How do the prevailing surplus and, especially, decisions about the allocation of future potential surpluses affect the formulation and conduct of monetary policy?

The best way to understand the issues involved, in my view, is through the concept of the policy mix. Using this concept will allow us to focus on the implications of alternative combinations of monetary and fiscal policies - yielding the same level of aggregate demand - for the real interest rate, the composition of output, and the current account balance. And it also will help us understand the key source of the interdependence of monetary and fiscal policy decisions.

In planning my remarks today, I intended first to discuss the analytics and the politics of the policy mix and then to illustrate the power of the analysis by using it to explain some of the important features of our current experience. For example, the swing in the structural budget balance clearly helps to explain the increase in the national saving rate and the high level of investment relative to GDP in this expansion. However, other important features of recent macroeconomic experience - specifically, the absence of a large decline in real interest rates and the significant deterioration in the current account balance - appear at least superficially inconsistent with the predictions from the standard analytics. One problem, I believe, is the assumption that the recent dramatic swing in the structural budget balance has been driven exclusively by policy. Perhaps even more important are other developments that have affected real interest rates and the current account balance over the same period. I could sum up these problems by noting that “ceteris” aren’t always “paribus”.

I conclude by discussing the implications for monetary policy of options for the allocation of the “potential” surplus. Because these options involve alternative policies, the standard framework should provide appropriate guidance about projected outcomes - ceteris paribus, of course. I focus on two ways in which fiscal policy choices could affect monetary policy. First, fiscal policy choices will affect the interest rate consistent with full employment and price stability. Second, a gradual reduction and ultimate retirement of the federal government debt would require changes in the way monetary policy is implemented.

The analytics of the policy mix

The policy mix is a very useful device for understanding the interaction of monetary and fiscal policies. There are infinite combinations of monetary and fiscal policies that yield the same level of aggregate demand. The different combinations result in different outcomes for the level of real interest rates, the composition of output, and the current account balance.
The composition of output at a given time has implications for the level of output over time. So the policy mix has quite important implications for macroeconomic performance. The policy mix can be easily represented in a simple IS-LM diagram, shown in Figure 1. This illustrates a simple model of macroeconomic general equilibrium. The IS curve shows the combinations of the interest rate (R) and output (Y) consistent with equilibrium in the output market. The LM curve shows the combinations of the interest rate and output consistent with equilibrium in the market for money. The intersection of the curves shows the unique combination of the interest rate and output consistent with equilibrium in all markets simultaneously.

Let’s start from a simple case in which the level of capacity is given at a moment in time and the intersection of the IS and LM curves determines the prevailing level of aggregate demand. As long as this intersection occurs at or to the left of the capacity limit, potential output (Y*), it determines the level of output, at least in the short run. I will assume that the intersection takes place at potential output, so that the outcome is consistent with the broad objectives of monetary policy: price stability and full employment. Evaluating alternative fiscal policies at an unchanged level of output - and specifically at potential output - also allows us to abstract from cyclical changes in the deficit and thereby to focus exclusively on changes in the structural budget balance.

The position of the LM curve is determined in part by the stance of monetary policy. I have drawn the conventional upward-sloping LM curve predicated on a constant money stock. In this case, we could view the level of the money stock as being adjusted by open market operations to be consistent with the Federal Reserve’s target for the interest rate. Alternatively, we could draw the LM curve as a horizontal line at the prevailing interest rate target set by the FOMC. The position of the IS curve is determined in part by fiscal policy, including the level of discretionary spending and benefit and tax rates. The latter rates also affect the slope of the IS curve, but I will abstract from that detail. In Figure 1, I depict two of the infinite number of combinations of IS and LM curves that intersect at potential output - the intersection at B corresponding to a looser fiscal and tighter monetary policy than the intersection at A.

The first question I want to answer is: what difference does it make where the intersection of the IS and LM curves occurs for a given level of output? The answer is: it makes a world of difference. A combination of tighter fiscal and looser monetary policy (point A compared with point B in Figure 1) implies a lower interest rate, a higher share of investment (and, in general, of interest-sensitive components of spending) in GDP, a lower value of the dollar, and a higher level of net exports and hence of the current account surplus. Therefore, by selecting a particular policy mix, policymakers can affect the amount of capital formation and the current account balance.

The above simple version of the IS-LM model does not begin to do justice to the topic. In more sophisticated models, we would see explicitly the long-run increase in output associated with a shift in the composition of output today toward more investment.
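Stepping back to the simple version for a moment, readers who want the diagram in symbols can use a generic textbook rendering of the system just described (standard notation; this is not the specific model behind Figure 1):

```latex
\begin{aligned}
\text{IS:}\quad       & Y = C(Y - T) + I(R) + G  &&\text{(output-market equilibrium)} \\
\text{LM:}\quad       & M/P = L(Y, R)            &&\text{(money-market equilibrium)} \\
\text{Capacity:}\quad & Y = Y^{*}                &&\text{(the vertical line in Figure 1)}
\end{aligned}
```

Fiscal choices (G and T) position the IS curve; monetary policy, through M or an interest rate target, positions the LM curve. Holding Y = Y*, a tighter fiscal/looser monetary mix solves for a lower R and hence a larger I(R), which is the comparison between points A and B. In an interest rate regime, the LM relation is in effect replaced by a reaction function, for example a Taylor-type rule R = R* + a(π - π*) + b(Y - Y*), a formulation taken up later in these remarks.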
More sophisticated models would also show explicitly the increase in the exchange rate and the associated decline in the current account balance in response to a policy mix that resulted in higher real interest rates. Finally, more sophisticated models could also take into account a second dimension of fiscal policy changes. Not only do changes in tax rates and spending affect aggregate demand, they can also affect aggregate supply, for example, when they alter after-tax wage rates and after-tax rates of return to saving and investment. The effect of any change in the policy mix will also be affected by supply-side incentives incorporated in the fiscal part of the mix.

The politics of the policy mix: does monetary policy respond to fiscal policy?

How does the choice about the policy mix get made? An interesting question! No, we do not have a joint committee on the policy mix that considers the benefits of alternative mixes and reaches a judgment. The policy mix is determined by sequential decisionmaking, subject to an understanding of the structure of the economy and of the likely responses of the other policymakers to one’s policy actions.

In my view, the process works as follows. The Administration and the Congress together make decisions that determine the fiscal part of the policy mix. Over the last 20 years, these decisions generally have been based on considerations that have more to do with long-run objectives, such as promotion of higher longer-term growth, than with short-run stabilization. Making these decisions takes considerable time because of the dynamics of the annual budget process and the legislative process. The current year’s decisions are incorporated, and the following years’ decisions are anticipated, in the fiscal policy assumptions underlying the Federal Reserve’s forecast, extending out a year or two. The Federal Reserve then sets its policy to achieve the broad objectives assigned to it, specifically, price stability and full employment. Fiscal decisions are, in turn, affected by budget forecasts that are partly contingent on monetary policy assumptions.

In effect, fiscal policymakers make the fundamental decision about the policy mix. Monetary policymakers smooth the transition to the new equilibrium by minimizing the effect on output relative to full employment and on prices. Because monetary policy adjusts continuously to changes in the economy, including those resulting from fiscal policy, it makes sense to think that fiscal policy decisions are made first and monetary policy decisions are conditional on the fiscal decisions.

So does this mean that monetary policy responds directly to fiscal policy actions? I believe it is more accurate to say that the Fed’s response is indirect. That is, monetary policy responds to changes in fiscal policy in much the same way that it responds to other influences on the economy, such as equity prices, exchange rates, or the demand for US exports due to changed growth prospects abroad. Each and all of these developments affect both the macroeconomic developments and the forecast that drive adjustments in monetary policy in pursuit of full employment and price stability.

It is also important not to overstate the role monetary policy plays in shaping the policy mix. Indeed, the fiscal policy decision uniquely determines the policy mix. In terms of our diagram, the game is fundamentally over when the fiscal decision pins down the intersection of the IS curve and the vertical line at full employment.
The only question remaining is how the LM curve will come to intersect at the same point. One possibility, of course, is that the Federal Reserve adjusts its open market operations to move the interest rate to this point. In a regime in which the Fed implements policy by choosing a target for the money stock, either the money stock could be adjusted to move the LM curve to this point, or price flexibility would ultimately get the job done, with the emphasis on “ultimately”. The role of active monetary policy is to avoid the need for price flexibility - that is, to prevent the fiscal decision from either temporarily lowering output below its full employment level or permanently increasing prices.

In an interest-rate regime, holding the funds rate unchanged following a shift in the IS curve would result in an escalating disequilibrium. For this reason, an interest-rate regime has to be modeled in terms of a policy reaction function. Under a plausible policy rule - for example, one utilizing actual or forecast output gaps and inflation rates - the interest rate would be reset over time until it was consistent with the intersection of the IS curve and the vertical line at potential output and price stability. The message of such an interest rate rule is that monetary policy responds only indirectly to fiscal policy. That is, the rule specifies the adjustment of the interest rate to changes in output and prices (the indirect approach), not the adjustment to changes in tax rates or spending (as would be the case in a direct approach). However, if we look at the reduced-form equation for the now endogenous monetary policy instrument, we shall, to be sure, find - lurking on the right-hand side - the exogenous components of the fiscal policy decision: the level of discretionary spending and the benefit and tax rates. That the response may be indirect does not make it any less systematic.

In the “real world”, of course, many factors besides fiscal policy are likely to be affecting inflation and output. So we may rarely actually observe the interrelationships implied by the analytics of the policy mix. That is, ceteris are rarely paribus, which brings me to my next topic.

Comparing the theoretical prediction to recent experience

The swing from budget deficit to surplus has been much more dramatic than was expected when the fiscal year 1994 budget was adopted. Such a dramatic swing in the budget balance might have been expected to yield a particularly dramatic confirmation of the predictions of the analytic model, in the form of a sharp decline in real interest rates and a significant improvement in the current account balance. In fact, during the period of a dramatic swing in the budget balance, real interest rates have not declined, and the current account balance has significantly deteriorated. If you hadn’t already suspected, we are about to find out that the real world is always more complex, and much more interesting, than our simple models.

Two explanations for this conflict between theoretical prediction and recent experience are possible. First, the preceding analysis is fundamentally flawed. Second, it is at least incomplete. I will take the second route and argue that the problem was that ceteris were not paribus in this episode. Ceteris paribus, of course, means “all other things being equal”. In our models, it is a way of examining the effect of one shock, holding constant all other shocks that could affect the variables in question.
In class and in model simulations, we can always impose ceteris paribus. Indeed, ceteris paribus was implicitly assumed all the way through the section on analytics. In the real world, we do not have this option. In addition, in the analytics, by holding output fixed at potential, we abstracted from the effect of cyclical influences on the budget balance, on interest rates, and on the current account. The key to getting the right answer in the analysis is to identify correctly the multiple shocks that induced the swing in the ratio of deficit to GDP and that affected the real interest rate and current account, as well as to take into account the effect of the cyclical developments.

I assumed in the analytics of the policy mix that any non-cyclical change in the deficit was due to a change in policy - some combination of increases in tax rates and cutbacks in government spending. Indeed, specific fiscal policy actions make this characterization appear qualitatively correct. But to explain the real-world outcomes we also have to allow for the role of other influences, specifically structural change.

A portion of the swing in the budget balance was of course due to the cyclical upswing. In our analytics, we were able to control for this by holding the level of output constant at potential output. But to apply this to our dynamic economy, we have to allow for uncertainty about the level of potential output at a given time and the growth in potential output over time.

A unique feature of the recent cyclical experience has been the divergence between the cyclical strength of the US economy and that of its major trading partners - that is, the weak expansion in Europe, the long period of stagnation in Japan, and the crises among many emerging market economies from late 1997 through 1998. This divergence in cyclical strength was accompanied for several years during this episode by a persistent appreciation of the dollar, further contributing to a deterioration in the current account.

However, the more serious problem with the application of the conventional framework is the failure to account for structural change, specifically a decline in the non-accelerating inflation rate of unemployment (NAIRU) and an increase in trend growth. The decline in the NAIRU, for example, would raise the equilibrium level of output and increase imports, worsening the current account balance. More rapid trend income growth would also increase the growth of imports and hence cause a deterioration in the current account. In addition, higher trend productivity growth, driven by an increase in the profitability of investing in new technology, would raise the equilibrium real interest rate and encourage capital flows to the United States to take advantage of higher returns on capital. This in turn would lead to an appreciation of the dollar, further augmenting the current account deficit. Once again, it appears that the increase in the economy’s trend rate of productivity growth is playing a starring role in our explanation of recent macroeconomic developments.

In discussing the interaction between the swing in the federal budget and the increase in the productivity trend, I have focused on how the latter may have offset the tendency of the former to lower real interest rates. But, of course, the interaction works in both directions. The increase in the equilibrium real interest rate, expected as a result of an increase in the productivity trend, is also a ceteris paribus result.
It depends critically on the assumed fiscal policy rule. For example, if tax rates are constant and government spending remains a constant share of GDP, then the higher productivity trend will, in general, yield a higher equilibrium real interest rate. However, if real (or nominal) government spending is held constant, the surplus will rise over time as a share of GDP, putting downward pressure on the equilibrium real rate, offsetting, at least in part, the effect on the real rate of the higher trend productivity.

The surplus conundrum and the policy mix

Although the discussion of the policy mix did not fully explain recent economic performance, I still believe it offers important insights into the alternative options for dealing with the current and projected surpluses, at least if ceteris really turn out to be paribus. But the recent episode underscores the difficulty of actually predicting real interest rates and the current account when unknowable shocks will surely intervene along the way.

My assignment is not to evaluate how much confidence we should have in projected surpluses or to assess the merits of alternative allocations of projected surpluses. Rather, my assignment is to connect fiscal and monetary policy decisions. I have emphasized the value of evaluating fiscal policy options in models that incorporate reasonable monetary policy reaction functions. Doing so builds in the indirect response of monetary policy to fiscal policy changes, consistent with the logic of the policy mix. This is, in effect, the approach I followed in my discussion of the analytics of the policy mix by assuming that monetary policy is reset - albeit indirectly - in response to a change in fiscal policy.

Retaining the projected non-social security surplus in the CBO baseline is one alternative. The other options involve increased fiscal stimulus and, as a result, will be accompanied, ceteris paribus, by a higher equilibrium real interest rate in the long run. They would also likely result in a lower share of investment in output and a higher current account deficit than the option that preserves higher surpluses. The specific outcomes will depend on the details, especially on the nature of supply-side incentives in any spending or tax changes. Ceteris paribus, monetary policy - run off a sensible reaction function - will end up validating the higher equilibrium real interest rate, to keep prices from accelerating indefinitely.

The implications of debt retirement for monetary policy operations

One additional consequence of the choice among these options is that preserving the surpluses would lead to a gradual decline in and ultimately the retirement of the federal government debt. The prospects for such an outcome depend, of course, on the realized growth rate of income and hence tax revenue and one’s assumption about the starting base of expenditures and the appropriate baseline for its rate of growth. I have already noted that such an outcome should not be viewed as a foregone conclusion. And, even if the Treasury debt were to be fully retired, that outcome would be likely to be transitory. Once the baby boomers begin to retire, the social security trust fund will be progressively run down, and at some point, the overall budget will likely move from surplus to deficit again. Today, Treasury securities account for the bulk of the Federal Reserve’s portfolio of assets.
Treasury securities are a convenient and natural choice for the Federal Reserve’s portfolio because they pose no credit risk and because the depth and liquidity of the Treasury market facilitate open market operations. Although the Treasury market has been the traditional vehicle for monetary policy operations in recent decades, the Federal Reserve Act provides authority for the Federal Reserve to purchase a fairly wide range of other assets - including obligations of federal agencies, certain obligations of state and local governments, foreign exchange, and sovereign debt. Moreover, the Federal Reserve often supplies reserves through repurchase transactions in addition to outright purchases. If the Treasury market became less liquid, we could substitute longer-term RPs against eligible collateral for some outright purchases of securities. If the existing classes of assets that the Federal Reserve is authorized to purchase or to acquire were deemed too narrow, we could pursue technical changes in the Federal Reserve Act to authorize transactions with a broader range of assets. Still another option would be to expand the role of the discount window in the provision of reserves to the banking system.

The key point is that declining Treasury debt does not pose any insurmountable long-term problem for the Federal Reserve. There would, of course, be transitional issues as monetary policy operations adapted. But we surely could maintain the effectiveness of our monetary policy operations. So a decision about whether or not to hold on to the surpluses and ultimately retire the government debt should not be affected by any concern that this option might undermine the effectiveness of monetary policy.

Some have been concerned that the Federal Reserve and the Treasury might be working at cross purposes today, to the extent that reductions in the Treasury debt supply have led to declines in longer-term Treasury rates at a time when monetary policy is aiming to slow the pace of economic activity to a more sustainable rate. To date, the main impact of Treasury operations and debt management prospects has been on longer-term Treasury rates, with only a small spillover effect on the financial variables that affect private spending decisions - short- and longer-term private interest rates, equity prices, and exchange rates. To the extent that Treasury debt management operations affect the private interest rates or other financial conditions that matter for private spending decisions, the FOMC can always adjust its policy settings as appropriate to achieve its objectives.

Conclusion

In debt management as well as other fiscal policy decisions, the Administration and the Congress should make the decisions that are in the best long-run interest of the economy. Monetary policy cannot affect the long-run consequences of such policy decisions, but it can adjust to smooth the economy’s transition to the new equilibrium. Monetary and fiscal policies should therefore be thought of as working as partners, rather than in competition with each other. In addition, decisions about the allocation of the government surplus or about debt management operations should not be limited by any concern that a gradual decline in, or even ultimate retirement of, the government debt would undermine the ability of monetary policy to achieve the broad objectives that the Congress has assigned to it.
Speech by Mr Laurence H Meyer, Member of the Board of Governors of the US Federal Reserve System, before the Derivatives Risk Management Symposium, Institute on Law and Financial Services, Fordham University School of Law, New York, on 25 February 2000.
Mr Meyer looks at the prospects for strengthening risk management for derivatives.

* * *

It is a pleasure to be with you today. These conferences sponsored by the Institute on Law and Financial Services bring together a diverse group of individuals with interests in banking and financial markets, facilitating discussion of issues that cut across disciplines. This year’s program, which explores topics associated with risk management for derivatives, is no exception.

The events in markets during the last few years have given market participants and policymakers ample incentive to reevaluate risk management procedures related to derivatives. The collapse of Long-Term Capital Management prompted studies by the President’s Working Group on Financial Markets, the banking and securities supervisors, and private market participants. With its study of LTCM behind it, the President’s Working Group released a long-awaited report that evaluates the regulatory framework for over-the-counter derivatives.

A major focus of these efforts has been the challenge of managing counterparty credit risk - that is, the risk that a counterparty will not settle an obligation for full value, either when the obligation is due or at any time thereafter. The policies and procedures within individual firms, and the techniques by which individual firms measure and manage counterparty risk, are prominent themes in both the report by the President’s Working Group on hedge funds and the guidance from bank supervisors that followed LTCM. However, less focus has been placed on the ways in which collective efforts to strengthen market infrastructure could reduce risk. This is a central theme of the President’s Working Group report on OTC derivatives. Its recommendations would enable market participants, working together, to develop new trading and clearing structures. The private-sector group known as the Counterparty Risk Management Policy Group (CRMPG) also calls for important cooperative efforts related to collateral programs, as does the International Swaps and Derivatives Association (ISDA).

My focus this afternoon will be on the importance of collective efforts to strengthen market infrastructure as part of the overall approach to the risk management of derivative transactions. At first, these two strands of inquiry may appear to have little in common. On reflection, however, we see that both attempt to broaden the range of tools that market participants use to manage the risks arising from OTC derivative activity. One set of studies and recommendations takes the perspective of individual firms and offers steps that firms can implement themselves to enhance risk management. The Working Group’s OTC study highlights ways in which collective efforts could enhance risk management.

1. Collective action to strengthen market infrastructure

Today’s OTC infrastructure remains decentralized; trades are executed and settled bilaterally. A review of that infrastructure reveals weaknesses or limitations in existing practices. Some of these weaknesses can be, and are being, addressed by individual firms. Other weaknesses require cooperation and collective action by firms. For example, firms could reduce risk and increase efficiency as they improve their individual back office procedures.
Risk could also be reduced through centralized mechanisms to execute and settle trades. Regulatory, legal, and operational barriers currently prevent some of these mechanisms from being used. Many of the recommendations in the Working Group’s report on OTC derivatives would remove unnecessary legal and regulatory barriers to such innovations. A challenge that policymakers always face in such times is to let markets evolve naturally and to refrain from attempting to dictate changes. The recommendations in the Working Group’s report reflect this evolutionary philosophy. My goal today is to review areas in which improvements are possible and to highlight the role that public policymakers can play.

2. Trading

Status quo. The status quo for the trading of OTC derivatives is a system of telephones and voice brokers. It is a bit incongruous that the financial instruments employing the most sophisticated asset-pricing technology are traded, by and large, by means of the lowest technology. Most OTC derivative transactions are executed by telephone between the traders acting for the two counterparties. Conversations are almost invariably recorded, and these recordings are used as evidence of the existence and terms of a trade if disputes arise. Traders are responsible for ensuring that prospective deals fall within credit lines for the counterparty and within overall trading limits. Firms have widely varying means for ensuring compliance with credit limits. Some firms have on-line systems through which traders can check the availability of credit lines. In other firms, the process is manual, and traders apply to a relationship officer before executing a trade.

Voice brokers are used in some transactions, most frequently for common and relatively standardized transactions. Brokers frequently are used for single-currency interest rate swaps and forward rate agreements. As in the spot foreign exchange markets, brokers are used to locate counterparties for these trades. They do not act as principal. Once counterparties who are willing to transact at the quoted price have been identified, brokers reveal their names so that they can determine if each other’s credit quality is acceptable and if the exposure can be accommodated within credit limits. As you see, automation of trading is limited in the OTC derivatives market.

Trading systems for foreign exchange. Among financial products, foreign exchange probably has recorded the most dramatic shifts in trading mechanisms in the last few years. The volume of foreign exchange traded through electronic brokering systems has grown rapidly and now accounts for some three-quarters of the trading in major markets. Besides these systems in the interdealer market, foreign exchange dealers also have developed electronic trading facilities for customers. Most bank dealers in foreign exchange have websites that allow their customers to trade electronically. Electronic trading between customers does not seem to be happening, although systems reportedly are under development.

Despite the enthusiasm with which electronic enhancements to trading have been embraced in the spot foreign exchange market, vendors have not successfully extended their services to derivative products involving foreign exchange. A service for the trading of forward foreign exchange has attracted meager volumes, as has a service for the trading of forward rate agreements. The latter service has been hampered by its inability to offer its products in the United States.
If the recommendations of the President’s Working Group were enacted, this trading system would be excluded from the Commodity Exchange Act (CEA), and the legal status of products offered through the service would be clear, likely enhancing its attractiveness.

Trading systems for swaps. Development of electronic trading systems for swaps also is likely being hampered by the potential application of the CEA. The CFTC has raised questions about the applicability of the CEA to a system that electronically matches swap trades between dealers. This system automates the functions that voice brokers currently provide in the interdealer market. Participants in the system electronically indicate their desire to enter into specific transactions. Other dealers can accept a transaction, or they can send an electronic message suggesting possible changes in terms. Participants can execute trades only with other dealers for whom they have acceptable credit limits. The credit limits of all dealers vis-à-vis each other are loaded into the system before trading. The managers of this system believe that regulatory uncertainty about the application of the CEA has slowed the growth of their business, too.

Potential risk-management benefits. What are the implications of these developments (or potential developments) for risk management? The most immediately apparent benefits spring from the changes that electronics bring to information flows. Currently, deals are struck over telephones, perhaps with the assistance of voice brokers. The data from those trades must be entered into the firms’ information systems. For some firms, data may even be keyed more than once. Electronic systems allow the quick, accurate capture of data. The data can then be used to update risk management information systems rapidly. Other benefits may be realized in firms’ ability to manage their credit limits. A feature of many electronic systems is credit limits that are programmed into the system. This feature narrows the ability of rogue traders to expose firms without the knowledge of risk managers.

Just as electronic systems can generate data useful in internal risk management, they can also generate data about the market for financial instruments. The rapid growth and widespread acceptance of electronic brokering reportedly has made the pricing process more transparent in the foreign exchange market. Dealers no longer have to do a transaction to discover where the market is trading. Similar improvements in price transparency could be expected in other products. End-users in these products may also reap benefits if competition among dealers increases and bid-offer spreads narrow.

3. Settlement

Status quo. Currently, settling derivatives transactions requires lots of paper and manual labor. Counterparties must confirm the details of deals with each other. The confirmation lists the economic features of the transaction as well as many legal terms. ISDA has developed templates for confirmations that market participants use for many products, but tailor-made confirmations may be necessary for certain products or certain counterparties. In some instances, confirmations are generated electronically, but for a range of products, the process is manual. Even electronically generated confirmations often must be manually verified by counterparties. Many confirmations are faxed between counterparties.
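As a sketch of what the electronic matching of confirmations discussed below might involve, consider comparing the economic fields of the two counterparties’ records of the same trade; the field names and values here are hypothetical, not drawn from any particular platform:

```python
# Stylized confirmation matching: compare the economic fields of the two
# counterparties' records of the same swap. Field names are illustrative.
ECONOMIC_FIELDS = ["trade_date", "notional", "fixed_rate", "maturity", "payer"]

def confirmation_breaks(record_a: dict, record_b: dict) -> list:
    """Return the economic fields on which the two records disagree."""
    return [f for f in ECONOMIC_FIELDS if record_a.get(f) != record_b.get(f)]

dealer = {"trade_date": "2000-02-25", "notional": 50_000_000,
          "fixed_rate": 0.0675, "maturity": "2005-02-25", "payer": "dealer"}
client = {"trade_date": "2000-02-25", "notional": 50_000_000,
          "fixed_rate": 0.0680, "maturity": "2005-02-25", "payer": "dealer"}

breaks = confirmation_breaks(dealer, client)
print(breaks or "confirmed")  # -> ['fixed_rate']: flagged for manual resolution
```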
S.W.I.F.T., an interbank messaging system, is used to confirm foreign currency options, forward rate agreements, and cross-currency swaps, but its usefulness is limited because both counterparties must employ the system. Electronics thus are not the rule in confirmation processing. Not surprisingly, the result has been significant backlogs. Active dealers report hundreds of unconfirmed trades. A small but significant share may be outstanding ninety days or more.

Another feature of settlements in OTC derivatives in recent years has been the development of collateral programs. US dealers, in particular, have rapidly expanded their use of collateral to mitigate counterparty credit risks. In these programs, counterparties typically agree that, if exposures change over time and one party comes to represent a credit risk to the other, the party posing the credit risk will post collateral to cover some (or all) of the exposure. These programs offer market participants a powerful tool for helping control credit risk, but they also embody substantial documentation and operational challenges. Alternative procedures could lead to risk reduction in many areas of the settlement of OTC derivatives. Collateral programs themselves could be strengthened to provide even greater benefits.

Need to address backlogs. Most dealers acknowledge that the failure to confirm trades heightens the risk that the transaction will be unenforceable. Because unconfirmed trades create the potential that errors in trade records and management information systems will go undetected, they also create the possibility that both market and credit risk will be measured incorrectly. Quantitative measures of market risk and credit risk are only as good as the transaction data on which they are based.

Most firms active in the markets recognize the need to address the backlogs in confirmations. The CRMPG report, for example, directed several recommendations toward firms’ need to improve market practices in this area, and it set a target of confirming trades within five days of the trade date. The problem of backlogs can be attacked from two directions. First, individual firms can strengthen their policies with regard to confirmations. They could monitor backlogs more carefully and devote resources to reducing those backlogs. They could enhance internal systems for capturing trade data and generating confirmations. Management can place a priority on reducing the backlog and assign clear responsibility for such reduction. Second, backlogs can be reduced by standardizing and electronically matching confirmations or by developing electronic trading systems that create a match at the time of the trade. These latter methods for reducing backlogs clearly must be pursued collectively. A system for electronically matching confirmations will be helpful only if substantial portions of a firm’s counterparties use the same system. Similarly, electronic trading systems require a critical mass of participants.

Need to strengthen collateral management practices. The volatile market conditions that surrounded the LTCM episode provided a test of many firms’ risk management systems and particularly their collateral programs. During the episode, collateral successfully mitigated credit risk, as designed. However, events also highlighted weaknesses in current policies and programs. These weaknesses have been examined in some depth in studies both by banking supervisors and by ISDA.
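As background for those findings, here is a stylized version of the daily collateral-call arithmetic such programs rely on; thresholds and minimum transfer amounts are standard features of collateral agreements, but every number below is hypothetical:

```python
# Stylized daily collateral call under a two-way collateral agreement.
# All parameter values are hypothetical.
def collateral_call(exposure: float, collateral_held: float,
                    threshold: float, min_transfer: float) -> float:
    """Additional collateral to call for (0.0 if below the triggers).

    exposure: current mark-to-market credit exposure to the counterparty
    collateral_held: collateral the counterparty has already posted
    threshold: unsecured exposure the parties agree to tolerate
    min_transfer: smallest call worth making, to avoid trivial transfers
    """
    shortfall = exposure - threshold - collateral_held
    return shortfall if shortfall >= min_transfer else 0.0

# Exposure has risen to $12m against $5m of collateral, with a $2m
# threshold and a $1m minimum transfer: call for the $5m shortfall.
print(collateral_call(12e6, 5e6, 2e6, 1e6))  # -> 5000000.0
```

Between calls, or when thresholds are large, unsecured exposure can open up quickly - a point the studies return to below.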
It is important to address these weaknesses because improperly managed collateral programs may give firms a false sense of security. Reductions in credit risk may not be as great as perceived. The studies point to ways in which individual firms can make collateral programs more effective. First and foremost, the studies emphasize that collateral is a complement to credit analysis; it is not a substitute for credit analysis. Supervisors observed that some firms accepted counterparties that were unwilling to provide information about their risk profile as long as they were willing to post collateral. The fallacy of that approach was vividly demonstrated. Positions may rapidly change in value, creating uncollateralized credit exposures to counterparties of unknown creditworthiness. A main point of these studies was the need for counterparties both to recognize the potential for unsecured exposures to arise in the future and to measure such exposures more carefully.

A second theme in the studies was the need for counterparties to recognize that, although collateral programs mitigate credit risk, they also introduce operational, liquidity, and legal risks. To realize the benefits of the collateral program, back office systems must be very robust. That is, counterparties must be able to value portfolios, track collateral posted, call for any deficiencies, and verify the timely receipt of collateral. Various studies have noted that firms relying on collateral programs must establish rigorous controls and devote sufficient resources to them. Additional liquidity and legal risks also arise with collateral programs. Counterparties entering into collateral agreements must ensure that they themselves can deliver collateral in a timely fashion, and the extensive legal documents related to the program must be enforceable in relevant jurisdictions.

Clearing. A device used to mitigate credit risk among groups of participants in many financial markets is a clearinghouse. A clearinghouse typically substitutes itself as central counterparty to all transactions that its members agree to submit for clearing. The creditworthiness of the clearinghouse is thus substituted for that of its members. The clearinghouse assumes the responsibility for managing credit risk through financial safeguards such as membership standards, capital requirements, and collateral systems.

The clearing of OTC derivatives is quite limited. Some clearing of OTC products has been conducted in Sweden for several years. Last year, the London Clearing House (LCH) began offering such services more broadly, but volumes to date have been limited. In the United States, the clearing of OTC derivatives has been hampered by legal uncertainty associated with the possible application of the Commodity Exchange Act. If the recommendations in the report of the President’s Working Group are implemented, this uncertainty will be resolved. Clearinghouses for OTC derivatives could be created under various regulatory regimes. The only restriction would be that the clearinghouse be supervised. That supervision could be provided by the CFTC, the Securities and Exchange Commission, the Comptroller of the Currency, or the Federal Reserve.

Clearing OTC derivatives offers several potential benefits. A clearinghouse can substantially mitigate counterparty credit risk through multilateral netting of obligations and implementation of sound risk controls.
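A toy example of why multilateral netting shrinks exposures; the three dealers and the amounts are invented for illustration:

```python
# Toy illustration of multilateral netting. Each tuple is an obligation
# (debtor, creditor, amount); the dealers and amounts are invented.
from collections import defaultdict

obligations = [("A", "B", 10e6), ("B", "C", 10e6), ("C", "A", 8e6)]

# Bilateral settlement: every obligation settles gross.
gross = sum(amount for _, _, amount in obligations)

# A clearinghouse steps in as central counterparty: each member then owes,
# or is owed, only its net position against the clearinghouse.
net = defaultdict(float)
for debtor, creditor, amount in obligations:
    net[debtor] -= amount
    net[creditor] += amount

net_flows = sum(v for v in net.values() if v > 0)
print(f"gross: ${gross/1e6:.0f}m, net: ${net_flows/1e6:.0f}m")
# A nets to -$2m, B to zero, C to +$2m: $28m of gross obligations
# collapse to $2m of net flows through the clearinghouse.
```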
Legal risks tend to be reduced with clearing because the default procedures of clearinghouses are supported by specific provisions of national law. Clearinghouses also usually impose stringent operational standards on members, and they likely would provide added impetus to efforts to develop automated systems for confirming trades. But clearing has limitations, too. Clearinghouses tend to concentrate risks and risk management. The key issue is how effectively a clearinghouse manages the risks it assumes. The record on risk management provided by clearinghouses in the United States is quite good. The same safeguards could be applied in the OTC context, particularly for relatively simple OTC products. But certain hurdles that arise because of the nature of OTC markets would have to be overcome. Contracts in the clearing process would have to be valued by models rather than by prices generated on an exchange floor. More important, in the event of a member default, OTC products likely would take longer to close out than exchange-traded products. This hurdle could be overcome, however, by the imposition of higher margin requirements on members or by the clearinghouse’s maintaining larger supplemental financial resources. Clearing also may have the perverse effect of increasing risk on counterparties’ portfolios of noncleared contracts. Clearinghouses for OTC contracts typically propose to clear the relatively simple OTC instruments. The remaining contracts will be settled bilaterally between the two counterparties, as they are today. The bilateral exposures on the noncleared contracts might increase to some degree if the contracts that were removed for clearing had been offsetting some of that exposure. The magnitude of this effect will vary from counterparty to counterparty because it is portfolio specific. Individual counterparties, therefore, must carefully assess the potential benefits of clearing. 4. Conclusion: the role of public policy The role of public policy is to encourage sound risk management. In this regard, policymakers are encouraging firms to enhance their risk-management systems, including appropriate management oversight, adequate risk-management policies and procedures, effective risk-measurement and monitoring systems, comprehensive internal controls and independent external audit. Policymakers simply cannot dictate the details of risk management based upon assumed market developments. Markets currently are in tremendous flux, and policymakers cannot foresee the needs in future years. Thus, the soundest course is to create a clear legal and regulatory environment within which market participants can develop risk management tools as needed. In some instances, rather than mandating certain steps, policymakers might provide proper incentives for risk-reducing steps through capital requirements or disclosure regulations. The recommendations in the Working Group’s report on OTC derivatives, in particular, aim to clarify the legal status of electronic communication, trading, and clearing systems. These recommendations, if implemented, offer a variety of enhanced risk management tools to market participants. Information flows could be improved, backlogs in the settlement process might be reduced, and the benefits of multilateral netting might be realized. However, it remains for market participants to develop the systems that will best serve markets within the given legal framework. I hope that market participants pursue these opportunities. 
The risks of trading and settling OTC derivatives can and should be reduced. Obviously, steps that firms can pursue individually are easier to tackle than those requiring collective action. But the potential benefits of new communication, trading, and clearing systems should be evaluated carefully. Participants in OTC markets have worked together in the past, developing standard master agreements, for example, and obtaining legal opinions on netting. Those experiences should be useful as these new opportunities for reducing risks are addressed. The challenge for policymakers as these new opportunities are evaluated may well be to do nothing. Policymakers no doubt will be tempted to mandate cooperation on the part of market participants in an effort to hasten developments that they believe may reduce risk. In adopting such a course, however, they risk pushing markets and market participants down inefficient and undesirable paths. Often, market participants have not pursued steps that appear desirable on the surface because the benefits do not seem sufficient to them. In that event, policymakers’ efforts would be better spent in helping demonstrate that the benefits have not been appropriately evaluated rather than in dictating market structures.
Mr Greenspan focuses on the revolution in information technology and its implications for key government policies Speech by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the Boston College Conference on the New Economy, Boston, on 6 March 2000. * * * In the last few years it has become increasingly clear that this business cycle differs in a very profound way from the many other cycles that have characterized post-World War II America. Not only has the expansion achieved record length, but it has done so with economic growth far stronger than expected. Most remarkably, inflation has remained largely subdued in the face of labor markets tighter than any we have experienced in a generation. A key factor behind this extremely favorable performance has been the resurgence in productivity growth. Since 1995, output per hour in the nonfinancial corporate sector has increased at an average annual rate of 3½%, nearly double the average pace over the preceding quarter-century. Indeed, the rate of growth appears to have been rising throughout the period. My remarks today will focus both on what is evidently the source of this spectacular performance - the revolution in information technology - and on its implications for key government policies. When historians look back at the latter half of the 1990s a decade or two hence, I suspect that they will conclude we are now living through a pivotal period in American economic history. New technologies that evolved from the cumulative innovations of the past half-century have now begun to bring about dramatic changes in the way goods and services are produced and in the way they are distributed to final users. Those innovations, exemplified most recently by the multiplying uses of the Internet, have brought on a flood of startup firms, many of which claim to offer the chance to revolutionize and dominate large shares of the nation’s production and distribution system. And participants in capital markets, not comfortable dealing with discontinuous shifts in economic structure, are groping for the appropriate valuations of these companies. The exceptional stock price volatility of these newer firms and, in the view of some, their outsized valuations indicate the difficulty of divining the particular technologies and business models that will prevail in the decades ahead. How did we arrive at such a fascinating and, to some, unsettling point in history? While the process of innovation, of course, is never-ending, the development of the transistor after World War II appears in retrospect to have initiated a special wave of innovative synergies. It brought us the microprocessor, the computer, satellites, and the joining of laser and fiber-optic technologies. By the 1990s, these and a number of lesser but critical innovations had, in turn, fostered an enormous new capacity to capture, analyze, and disseminate information. It is the growing use of information technology throughout the economy that makes the current period unique. However, until the mid-1990s, the billions of dollars that businesses had poured into information technology seemed to leave little imprint on the overall economy. The investment in new technology arguably had not yet cumulated to be a sizable part of the US capital stock, and computers were still being used largely on a stand-alone basis. The full value of computing power could be realized only after ways had been devised to link computers into large-scale networks. 
As we all know, that day has arrived. At a fundamental level, the essential contribution of information technology is the expansion of knowledge and its obverse, the reduction in uncertainty. Before this quantum jump in information availability, most business decisions were hampered by a fog of uncertainty. Businesses had limited and lagging knowledge of customers’ needs and of the location of inventories and materials flowing through complex production systems. In that environment, doubling up on materials and people was essential as a backup to the inevitable misjudgments of the real-time state of play in a company. Decisions were made from information that was hours, days, or even weeks old. Of course, large voids of information still persist, and forecasts of future events on which all business decisions ultimately depend will always be prone to error. But information has become vastly more available in real time - resulting, for example, from developments such as electronic data interchange between the retail checkout counter and the factory floor or the satellite location of trucks. This surge in the availability of more timely information has enabled business management to remove large swaths of inventory safety stocks and worker redundancies. Stated differently, fewer goods and worker hours are now involved in activities that, although perceived as necessary insurance to sustain valued output, in the end produced nothing of value. Those intermediate production and distribution activities, so essential when information and quality control were poor, are being reduced in scale and, in some cases, eliminated. These trends may well gather speed and force as the Internet alters relationships of businesses to their suppliers and their customers, a topic to which I shall return in a moment. The process of information innovation has gone far beyond the factory floor and distribution channels. Computer modeling, for example, has dramatically reduced the time and cost required to design items ranging from motor vehicles to commercial airliners to skyscrapers. In a very different part of the economy, medical diagnoses have become more thorough, more accurate, and far faster. With access to heretofore unavailable information, treatment has been hastened, and hours of procedures have been eliminated. Moreover, the potential for discovering more-effective treatments has been greatly enhanced by the parallel revolution in biotechnology, including the ongoing effort to map the entire human genome. That work would have been unthinkable without the ability to store and process huge amounts of data. The advances in information technology also have been an impetus to the ongoing wave of strategic alliance and merger activity. Hardly a week passes without the announcement of another blockbuster deal. Many of these combinations arise directly from the opportunities created by new technology - for example, those at the intersection of the Internet, telecommunications, and the media. It is not possible to know which of the many new technologies will ultimately find a firm foothold in our rapidly changing economy. Accordingly, many high-tech companies that wish to remain independent are hedging their bets by entering into strategic alliances with firms developing competing technologies. In addition, the new technology has fostered full mergers that allow firms to take greater advantage of economies of scale and thus reduce costs.
Without highly sophisticated information technology, it would be nearly impossible to manage firms on the scale of some that have been proposed or actually created of late. Although it will be a while before the ultimate success of these endeavors can be judged, information technology has almost certainly pushed out the point at which scale diseconomies begin to take hold for some industries. The impact of information technology has been keenly felt in the financial sector of the economy. Perhaps the most significant innovation has been the development of financial instruments that enable risk to be reallocated to the parties most willing and able to bear that risk. Many of the new financial products that have been created, with financial derivatives being the most notable, contribute economic value by unbundling risks and shifting them in a highly calibrated manner. Although these instruments cannot reduce the risk inherent in real assets, they can redistribute it in a way that induces more investment in real assets and, hence, engenders higher productivity and standards of living. Information technology has made possible the creation, valuation, and exchange of these complex financial products on a global basis. At the end of the day, the benefits of new technologies can be realized only if they are embodied in capital investment, defined to include any outlay that increases the value of the firm. For these investments to be made, the prospective rate of return must exceed the cost of capital. Technological synergies have enlarged the set of productive capital investments, while lofty equity values and declining prices of high-tech equipment have reduced the cost of capital. The result has been a veritable explosion of spending on high-tech equipment and software, which has raised the growth of the capital stock dramatically over the past five years. The fact that the capital spending boom is still going strong indicates that businesses continue to find a wide array of potential high-rate-of-return, productivity-enhancing investments. And I see nothing to suggest that these opportunities will peter out any time soon. Indeed, many argue that the pace of innovation will continue to quicken in the next few years, as companies exploit the still largely untapped potential for e-commerce, especially in the business-to-business arena, where most observers expect the fastest growth. An electronic market that would automatically solicit bids from suppliers has the potential for substantially reducing search and transaction costs for individual firms and for the economy as a whole. This reduction would mean less unproductive search and fewer work hours more generally embodied in each unit of output, enhancing output per hour. Already, major efforts have been announced in the auto industry to move purchasing operations to the Internet. Similar developments are planned or in operation in many other industries as well. It appears to be only a matter of time before the Internet becomes the prime venue for the trillions of dollars of business-to-business commerce conducted every year. There can be little doubt that, on balance, the evolving surge in innovation is an unmitigated good for the large majority of the American people.
Yet, implicit in the very forces of change that are bringing us a panoply of goods and services considered unimaginable only a generation ago are potential financial imbalances and worker insecurities that need to be addressed if the full potential of our technological largesse is to be achieved. As I testified before the Congress last month, accelerating productivity entails a matching acceleration in the potential output of goods and services and a corresponding rise in real incomes available to purchase the new output. The pickup in productivity, however, tends to create even greater increases in aggregate demand than in potential aggregate supply. This occurs principally because a rise in structural productivity growth, not surprisingly, fosters higher expectations for long-term corporate earnings. These higher expectations, in turn, not only spur business investment but also increase stock prices and the market value of assets held by households, creating additional purchasing power for which no additional goods or services have yet been produced. Historical evidence suggests that perhaps three to four cents out of every additional dollar of stock market wealth eventually is reflected in increased consumer purchases. The sharp rise in the amount of consumer outlays relative to disposable incomes in recent years, and the corresponding fall in the saving rate, is a reflection of this so-called wealth effect on household purchases. Moreover, higher stock prices, by lowering the cost of equity capital, have helped to support the boom in capital spending. Outlays prompted by capital gains in equities and homes in excess of increases in income, as best we can judge, have added about 1 percentage point to annual growth of gross domestic purchases, on average, over the past half-decade. The additional growth in spending of recent years that has accompanied these wealth gains, as well as other supporting influences on the economy, appears to have been met in equal measure by increased net imports and by goods and services produced by the net increase in newly hired workers over and above the normal growth of the workforce, including a substantial net inflow of workers from abroad. But these safety valves that have been supplying goods and services to meet the recent increments to purchasing power largely generated by capital gains cannot be expected to absorb indefinitely an excess of demand over supply. Growing net imports and a widening current account deficit require ever-larger portfolio and direct foreign investments in the United States, an outcome that cannot continue without limit. Imbalances in the labor markets may perhaps have even more serious implications for potential inflation pressures. While the pool of officially unemployed and those otherwise willing to work may continue to shrink, as it has persistently over the past seven years, there is an effective limit to new hiring, unless immigration is uncapped. At some point in the continuous reduction in the number of available workers willing to take jobs, short of the repeal of the law of supply and demand, wage increases must rise above even impressive gains in productivity. This would intensify inflationary pressures or squeeze profit margins, with either outcome capable of bringing our growing prosperity to an end.
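As a rough illustration of the wealth-effect arithmetic cited above (the figures below are hypothetical; only the three-to-four-cent propensity comes from the text):

# Hypothetical: equity wealth rises by $1,000 billion in a year.
wealth_gain = 1_000  # billions of dollars

# Per the speech, roughly 3 to 4 cents of each extra dollar of stock
# market wealth eventually shows up in consumer purchases.
mpc_low, mpc_high = 0.03, 0.04

print(wealth_gain * mpc_low, wealth_gain * mpc_high)
# 30.0 to 40.0 billion dollars of added spending, for which no
# additional output has yet been produced.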
In short, unless we are able to indefinitely increase the rate of capital flows into the United States to finance rising net imports or continuously augment immigration quotas, overall demand for goods and services cannot chronically exceed the underlying growth rate of supply. Our immediate goal at the Federal Reserve should be to encourage the economic and financial conditions that will best foster the technological innovation and investment that spur structural productivity growth. It is structural productivity growth - not the temporary rise and fall of output per hour associated with various stages of the business cycle - that determines how rapidly living standards rise over time. Achievement of this goal requires a stable macroeconomic environment of sustained growth and continued low inflation. That, in turn, means that the expansion of demand must moderate into alignment with the more rapid growth rate of potential supply. The current gap between the growth of supply and demand for goods and services, of necessity, has been reflected in an excess in the demand for funds over new savings from Americans, including those savings generated by rising budget surpluses. As a consequence, real long-term corporate borrowing costs have risen significantly over the past two years. Presumably as a result, many analysts are now projecting that the rate of increase in stock market wealth may soon begin to slow. If so, the wealth effect adding to spending growth would eventually be damped, and both the rate of increase in net imports as a share of GDP, and the rate of decline in the pool of unemployed workers willing to work should also slow. However, so long as these two imbalances continue, reflecting the excess of demand over supply, the level of potential workers will continue to fall and the net debt to foreigners will continue to rise by increasing amounts. Until market forces, assisted by a vigilant Federal Reserve, effect the necessary alignment of the growth of aggregate demand with the growth of potential aggregate supply, the full benefits of innovative productivity acceleration are at risk of being undermined by financial and economic instability. The second consequence of rapid economic and technological change that needs to be addressed is growing worker insecurity, the result, I suspect, of fear of potential job skill obsolescence. Despite the tightest labor markets in a generation, more workers currently report they are fearful of losing their jobs than similar surveys found in 1991 at the bottom of the last recession. The marked move of capital from failing technologies to those at the cutting edge has quickened the pace at which job skills become obsolete. The completion of high school used to equip the average worker with sufficient skills to last a lifetime. That is no longer true, as evidenced by community colleges being inundated with workers returning to school to acquire new skills and on-the-job training being expanded and upgraded by a large proportion of American business. Not unexpectedly, greater worker insecurities are creating political pressures to reduce the fierce global competition that has emerged in the wake of our 1990s technology boom. Protectionist measures, I have no doubt, could temporarily reduce some worker anxieties by inhibiting these competitive forces. However, over the longer run such actions would slow innovation and impede the rise in living standards. 
They could not alter the eventual shifts in production that owe to enormous changes in relative prices across the economy. Protectionism might enable a worker in a declining industry to hold onto his job longer. But would it not be better for that worker to seek a new career in a more viable industry at age 35 than hang on until age 50, when job opportunities would be far scarcer and when the lifetime benefits of additional education and training would be necessarily smaller? To be sure, assisting those who are already close to retirement in failing industries is an imperative. But that can be readily accomplished without distorting necessary capital flows to newer technologies through protectionist measures. More generally, we must ensure that our whole population receives an education that will allow full participation in this dynamic period of American economic history. These years of extraordinary innovation are enhancing the standard of living for a large majority of Americans. We should be thankful for that and persevere in policies that enlarge the scope for competition and innovation and thereby foster greater opportunities for everyone.
Mr Ferguson comments on a number of aspects regarding the convergence of regulatory standards: a work in process Speech by Mr Roger W Ferguson, Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, at the Institute of International Bankers, Washington, D.C. on 6 March 2000. * * * It is a pleasure to be here and to address the members of the Institute of International Bankers. Foreign banks play a critical role in the US financial system, accounting for nearly one-quarter of total US banking assets throughout the 1990s. The Federal Reserve has consistently supported open US markets for foreign banks and has long recognized the value that you add to our economy and financial markets. Conferences such as this provide important opportunities for the banking and supervisory communities to meet with one another and to share our thoughts and concerns. In my comments today, I would like to discuss the apparent motivation for managers to create global financial institutions, which provides the background for a convergence of regulatory standards around the world. I would also like to mention briefly the approach that the Federal Reserve is taking with respect to foreign banks. Finally, I would like to focus on the emerging global regulatory standards, mainly the efforts of the Basel Committee on Banking Supervision to revise risk-based capital standards. Global consolidation of financial institutions The need for sound and consistent policies and procedures throughout the world has become more important as our financial markets and financial institutions have become larger, more complex, and more tightly integrated. If anyone needs proof of that statement, the spread of the Asian crisis during the course of 1997 and 1998 stands as the most obvious example of the growing integration of financial markets. Informed researchers would argue that the global consolidation of financial institutions appears to be driven by several factors. Originally, globalization probably reflected the desire of banks to serve their domestic customers as those customers themselves expanded overseas. This was probably a defensive measure designed to retain customers and preclude others from making inroads into longstanding customer relationships. For other financial institutions, the motivation for overseas expansion may well have been to fully leverage perceived comparative advantages in important business lines, such as custody, that were characterized by high fixed cost and economies of scale. A related motivation might have been to take advantage of cultural affinity, which itself is another form of comparative advantage. The strong presence of Spanish financial institutions in Latin America may be the prime example of this motivation. Finally, many institutions may have expanded to meet ambitious aspirations for growth that comparatively small local markets might not accommodate. Certainly, one might argue that the global reach of major Dutch and Swiss institutions reflects this motivation, at least in part. Importantly, one must ask if global institutions are more successful than their home-bound competitors. At least one private-sector study, from a major consulting firm, suggests that during the ten-year period ending 31 December 1996, most of the global financial services companies did not achieve significantly superior returns to shareholders. Nor did they appear to achieve superior revenue growth. 
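The headline capital tests just described reduce to a simple screen; the sketch below applies them to a hypothetical foreign bank (the numbers are invented, and the case-by-case qualitative assessment the speech describes is deliberately omitted):

def meets_fhc_capital_screens(tier1_ratio, total_ratio, leverage_ratio, foreign_bank=True):
    """Headline capital tests described in the interim rule.

    Risk-based tests: 6% tier 1 and 10% total capital. The leverage
    test is 3% for foreign banks, applied at the holding company
    level, versus the 5% ratio required of US banks.
    """
    leverage_floor = 0.03 if foreign_bank else 0.05
    return (tier1_ratio >= 0.06 and
            total_ratio >= 0.10 and
            leverage_ratio >= leverage_floor)

# Hypothetical foreign bank: 7.2% tier 1, 11.5% total, 4.1% leverage.
print(meets_fhc_capital_screens(0.072, 0.115, 0.041))  # True

As the speech emphasizes, the composition of capital, accounting standards, debt ratings, and home-country supervision all enter the actual determination, so no simple screen of this kind is dispositive.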
I am sure that there will be other studies that attempt to determine the effect of global consolidation on the financial results of the firms involved. In any event, regardless of the presence or absence of financial success, the trend toward a more global financial industry will continue. Such market forces will provide a continuing impetus for continued cooperation and coordination among bank supervisors worldwide. Indeed, the forces for cooperation will only grow as these trends become more pronounced. The Gramm-Leach-Bliley Act Issues related to the international supervision and regulation of banks come at a time when we in the United States face new challenges in implementing financial reform and the provisions of the Gramm-Leach-Bliley Act. That act has the potential, and indeed the intention, to change substantially the structure, activities, and supervision of financial institutions in this country. Many of the foreign banks represented here today operate as universal banks outside the United States and, like many US banks, you have been frustrated by the outmoded restrictions on your banking activities in our markets. After much debate, the US Congress enacted legislation that permits banks operating in the United States to expand their activities within a legal and supervisory framework intended to preserve their safety and soundness. As you well know, the Federal Reserve Board recently issued for public comment an interim amendment to Regulation Y setting forth the procedures for banking organizations to elect to become financial holding companies and avail themselves of the broader powers authorized under the act. In according financial holding company status to foreign banks, the Congress instructed the Board to apply capital and managerial standards comparable to those pertaining to US banking organizations, giving due regard to the principles of national treatment and equality of competitive opportunity. The rule applies the US bank risk-based capital standards of 6% Tier 1 capital and 10% total capital to foreign banks wishing to become financial holding companies. It also applies the US leverage ratio to foreign banks, but at the lower holding company level of 3%, instead of the 5% ratio required of US banks. To consider differences in banking and accounting practices in many foreign countries, the Board also will assess the capital and management of foreign banks case by case. This assessment will take into account, when appropriate, a number of factors such as the bank’s composition of capital, accounting standards, long-term debt ratings, reliance on government support to meet capital requirements, and the extent to which the bank is subject to comprehensive, consolidated supervision. The intent of this approach is to provide the flexibility necessary to take into account all relevant factors in a way that will be equitable to all banks, foreign and domestic. We understand that there is concern that this procedure will be subject to delays, resulting in disadvantages to foreign banks. Let me assure you that we fully intend to deal with submissions from foreign banks expeditiously and in the same time frames that are provided for the review of submissions by US companies. If different procedures will allow us to meet the statutory requirements on comparability, we are very open to considering them. 
The Board recognizes that this interim rule is of great interest to foreign banks and that it raises complex issues - in particular, how to achieve comparability as required by the law while still respecting the home country supervisory framework. This balance is difficult to achieve, and the Board intends to give careful consideration to the comments it receives in response to the interim rule. The Board is committed to implementing this new law in a manner that is equitable and fair to all institutions and that ensures a sound and stable framework for the evolution of financial services in the United States. Basel Committee on Banking Supervision In supervising financial holding companies, the Federal Reserve will need to consider not only the Gramm-Leach-Bliley Act but also the policies and actions of other agencies in this country and abroad. Fortunately, we have been working together for several years to deal with issues arising from activities of financial conglomerates. I think that much of the experience we have gained through our participation in groups such as the Basel Committee on Banking Supervision, the Joint Forum, and the Financial Stability Forum will help us meet the challenges ahead. Because time is brief, I will focus the balance of my remarks on the work of the Basel Committee. The Federal Reserve has been actively involved in this committee since its inception in the mid-1970s, and the Basel Committee continues to take the lead in coordinating banking supervisory policies and practices globally. Although representatives from the G10 countries and Luxembourg do the work of the committee, it recognizes that supervisors in most of the non-G10 countries typically adopt the policies and principles that the committee adopts. As a result, the committee has sought to incorporate the views of supervisors throughout the world. A non-G10 working group, for example, is participating in the current revisions to the Capital Accord. As I know you are all aware, the Basel Committee is devoting a tremendous amount of time and resources to the effort to develop a new capital adequacy framework that is more sensitive to the level of underlying economic risk. Indeed, comments are due soon on a consultative paper issued last year on this topic. Feedback from the industry is important to the Basel Committee, and I hope that many of you will be communicating your ideas and reactions to the Bank for International Settlements. Capital requirements are an essential supervisory tool for fostering the safety and soundness of banks. The 1988 Basel Capital Accord was a major achievement in establishing a uniform standard for internationally active banks. In the years since, the committee has continued to develop and refine the standard in an effort to keep pace with banking practices and to maintain adequate levels of bank capital throughout the world. Many have asked why the Basel Committee is revising the accord at this time. There has been recognition from the start that the 1988 accord was rather crude and imperfect in many respects. Although that accord incorporates some differentiation in credit risk, it is limited. Moreover, the accord does not explicitly address interest rate risk, operational risk, or other risks that can be substantial at some banks. Consequently, some countries, including the United States, have put in place supplementary requirements - such as target ratios above the minimum levels - to help mitigate the accord’s shortcomings. 
For example, as a further prudential measure, the United States decided to apply a separate leverage constraint to provide some limit to leverage, regardless of what the risk-weighted Basel approach might allow. Beyond these initial and inherent imperfections of the accord, simply the dramatic innovations over the past decade in financial markets and in the ways in which banks manage and mitigate credit risk have driven the need for change. The committee has been concerned particularly about the incentives that the accord gives banks to take on higher-risk, higher-reward transactions, and to engage in regulatory capital arbitrage. Efforts to make the standard more sensitive to underlying risk should greatly reduce these incentives. Basel Committee Consultative Paper Last year’s consultative paper set out a new paradigm for judging capital adequacy based on a set of three so-called pillars. Pillar I is sound minimum capital standards or, in essence, the existing Accord with improvements. Pillar II is supervisory oversight of capital adequacy at individual banks, and pillar III is market discipline supported by adequate public disclosures by banks. These three pillars represent an evolution in the Basel Committee’s approach to capital adequacy and should be mutually reinforcing. The addition of pillars II and III acknowledges the importance of ongoing review by supervisors of the capital adequacy at individual banks and the critical role of market discipline in controlling the risk-taking of banks. The committee’s revisions to pillar I are aimed at developing minimum capital standards that more accurately distinguish degrees of credit risk and that are appropriate for banks of varying levels of sophistication. In its consultative document issued last June, the committee set out two possible approaches: a standardized approach that would tie capital requirements to external credit assessments, such as credit ratings, and another approach that would be based on a bank’s own internal ratings. The latter would derive a capital requirement from bank estimates of default probabilities and from estimated losses-given-default on individual exposures. Using such estimates would help greatly in making capital requirements more sensitive to different levels of risk but would also introduce more subjectivity and a lack of transparency into the process. Therefore, we may need to limit or constrain certain measures. How to validate the estimates will also be an issue, especially considering that banks, themselves, often have little historical data on which to base key assumptions and calculations. Comparability and competitive equity among banks and national banking systems will be important factors in the debate. Nevertheless, the two-pronged approach of offering both a simplified and a more complex method seems both necessary and reasonable in order to accommodate all types of banks. Even then, however, we must recognize that any standard will continue to evolve. Although I believe that an internal ratings-based approach would provide an important step forward, its results would still likely differ from those of a bank’s own economic capital allocation models. Questions about the correlation of risks among different asset groups and about how and whether to consider them in a regulatory capital standard are still unresolved and go to the heart of full credit risk models. Beyond credit risk is the highly complex matter of operating risk and other risks that are not explicitly dealt with in the accord. 
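Looping back to the internal ratings-based approach described above, here is a toy sketch of how bank-estimated default probabilities and losses-given-default might map into a capital amount; every number, including the scaling multiple standing in for unexpected-loss coverage, is hypothetical, since that calibration was exactly what was under debate:

def irb_style_charge(pd, lgd, ead, scaling=8.0):
    """Toy internal-ratings-style capital amount for one exposure.

    Expected loss = PD x LGD x EAD, scaled up by a hypothetical
    multiple meant to cover unexpected loss as well.
    """
    expected_loss = pd * lgd * ead
    return scaling * expected_loss

# Hypothetical loan: 1% default probability, 45% loss-given-default,
# $10 million exposure at default.
print(irb_style_charge(0.01, 0.45, 10_000_000))  # 360000.0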
In the past, of course, the overt charge for credit risk has carried the full load of these other risks, but that has begun to change. Both the Basel Supervisors Committee and the international banking community need to address these topics more directly and more satisfactorily. Regardless of what regulators and supervisors do, you as bank managers must fully recognize and control your risks. As you make greater progress, so can regulators. In the past, relatively rough rules-of-thumb and traditional practices sufficed in supervising and managing banks. But just as derivatives have allowed you to unbundle risks and to price and structure financial products with more precision, similar technologies and innovations are requiring more precision in almost everything else you do - and in everything we do as bank supervisors. Opportunities for arbitrage within financial markets and capital regulations are easily found. What you do within the industry and what we do as bank supervisors must be more closely connected in all respects to the underlying economics. Meeting that challenge will keep all of us on our toes. Recognizing that supervisors need to relate capital requirements of individual banks more closely to their unique risk profiles, the Basel Committee’s second pillar - the supervisory review of capital - emphasizes principles such as these:
• that supervisors should expect, and have the authority to require, banks to operate above the minimum regulatory capital ratios
• that they should require their banks to assess and maintain overall capital adequacy in relation to underlying risks
• that supervisors should review and evaluate the internal capital adequacy assessments and strategies of banks, as well as their compliance with regulatory capital ratios
• that supervisors should intervene at an early stage to prevent capital from falling below prudent levels and should require remedial action quickly if capital becomes inadequate.
I believe it is essential to have effective supervisory oversight and assessment of individual bank capital as a complement to meeting regulatory capital requirements. This does not mean, however, that supervisors have ultimate responsibility for determining the adequate level of capital at each bank nor that supervisory judgment should replace that of bank management. Rather, an active dialogue should take place between bank management and supervisors with regard to the optimum levels of capital. Pillar II thus moves the accord beyond a simple ratio-based standard to a more comprehensive approach for assessing the adequacy of capital levels. The supervisory review of capital called for in pillar II obviously will have resource implications for supervisors around the world and may require significant changes in supervisory cultures and techniques in many countries, both G10 and non-G10. The committee will need to develop guidance for bank supervisors to use when evaluating the adequacy of internal capital assessment processes. A residual benefit of such evaluations will be that supervisors will more easily stay current with evolving industry practices related to risk management and will better understand the risks that individual banks face. The third element of the proposed new capital framework - pillar III - relates to market discipline, which I believe all supervisors recognize as a critical complement to their supervisory oversight process.
When banks disclose timely and accurate information about their capital structure and risk exposures, market participants can better evaluate their own risks in dealing with such institutions. Greater market discipline, in turn, gives banks more incentive to manage their risks effectively and to remain adequately capitalized. Recognizing that current disclosure practices in some countries are relatively weak, the Basel Committee under this pillar is working on guidelines that would make banking risks more transparent. The goal is to protect legitimate proprietary information, while promoting more consistent disclosure among nations. Adequate disclosure becomes even more important as we base regulatory capital requirements on internal risk measures of banks. Clearly, more information, by itself, is not always better. Working with the industry, we need to decide which specific elements are needed to do the job. Indeed, an ongoing partnership between banks and supervisors is crucial to the success of any regulatory capital standard and to the success of the supervisory process overall. It is in everyone’s interest that we succeed in this effort and that the international financial system remain sound. Other Basel Committee initiatives Although the Basel Committee may be best known for its work on capital standards, its efforts extend well beyond that - as suggested by the two other pillars. Its development of the Core Principles for Effective Banking Supervision in 1997 is particularly noteworthy. These twenty-five principles cover a broad range of supervisory issues involving licensing and supervising banks and enforcing supervisory judgments. By setting reasonable thresholds for standards that banking supervisors in all countries should achieve, the committee has substantially helped to promote financial stability worldwide. Of course, the challenge now is to help all countries meet these core principles, despite the limited expertise and resources some may have. Fortunately, the International Monetary Fund and World Bank are working with the committee and can be of significant help. The securities and insurance supervisors have taken similar steps in developing supervisory principles of their own that should contribute to stronger supervisory regimes worldwide and provide a framework for further harmonization with banking standards, when appropriate. Conclusion In closing, I see no shortage of difficult challenges facing financial institution supervisors in the period ahead. Clearly, supervisors of the various sectors of the financial industry - banking, insurance, and securities - will continue to be confronted with rapid and dramatic changes in banking and financial markets. Supervisors will need to react to technological innovations, expansion of financial institutions into new and increasingly more complex activities, and ongoing consolidation within the industry worldwide. A rigorous, coordinated supervisory approach will be necessary to counterbalance the pressures of an increasingly dynamic and competitive marketplace. I am confident that by working together and with the financial industry supervisors we can meet the challenge. I want to assure you that, as central bankers, we at the Federal Reserve have strong interests in maintaining efficient, well-managed, and responsible financial institutions. I wish you all well in the years ahead. Thank you for your attention.
Mr Meyer remarks on structural change and monetary policy in the United States Remarks by Mr Laurence H Meyer, Member of the Board of Governors of the US Federal Reserve System, before the Joint Conference of the Federal Reserve Bank of San Francisco and the Stanford Institute for Economic Policy Research, Federal Reserve Bank of San Francisco, San Francisco, on 3 March 2000. * * * Structural change is a central theme in virtually any explanation of the exceptional performance of the US economy over the past several years. Structural changes of uncertain magnitude and timing have increased the difficulty in forecasting, undermined confidence in our understanding of the structure of the economy, and increased the risk of measurement error with respect to key variables. In my remarks, I will offer a bridge from today’s discussion of structural changes to the implications for monetary policy, the subject of tomorrow’s agenda. In my view, the most important challenges to monetary policy related to structural change in this episode arise from possible changes in aggregate supply - specifically in the non-accelerating inflation rate of unemployment (NAIRU) and in trend growth. The key challenge for monetary policymakers during this period, in my view, has been to allow the economy to realize the full benefits of the new possibilities while avoiding an overheated economy. More fundamentally, the challenge has been to adapt the strategy of monetary policy in light of the uncertainties associated with structural change. My focus is therefore not on structural change per se but rather on the uncertainty about key parameters likely to be heightened during a period of structural change. However, the key structural change during this episode - an increase in the underlying productivity growth trend - has also set in motion a complex of effects on inflation, interest rates, equity prices, and aggregate demand. Even if we knew the precise value of the higher productivity trend, we would likely remain uncertain about the size and persistence of many of its effects. As a result, adapting monetary policy to a higher trend rate of productivity growth would be a challenge, especially in an interest-rate setting regime, even if there were no uncertainty about the new underlying growth trend. Perspectives on monetary policy strategy There are, in my view, two fundamental requirements of a prudent monetary policy. First, monetary policy should impose a nominal anchor, pinning down the long-run inflation rate. Second, monetary policy should lean against the cyclical winds. The second requirement contributes to the first and also to smoothing fluctuations in output around full employment. This view of the mission of monetary policy is consistent with both the dual mandate for the Federal Reserve in the Federal Reserve Act and with flexible inflation targeting regimes in many countries. The key to the practice of monetary policy is to develop a strategy for the discretionary conduct of policy that meets these requirements. A constructive way to describe such a strategy is to formalize it in terms of an explicit rule. John Taylor has offered an attractive and simple form of such a rule. But perhaps equally important, John’s approach has encouraged a wider acceptance of the study of rules by emphasizing that the objective is to inform discretionary monetary policy decisions rather than to replace discretion by a rule. 
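For reference, a minimal sketch of a rule of the simple form Taylor proposed; the coefficients shown are the illustrative values from his 1993 paper, not a description of actual FOMC practice:

def taylor_rule(inflation, output_gap, r_star=2.0, pi_star=2.0, a_pi=0.5, a_y=0.5):
    """Taylor-style rule: i = r* + pi + a_pi*(pi - pi*) + a_y*gap.

    All arguments are in percent. Note that r* (the equilibrium real
    rate) and the output gap are exactly the quantities that, as
    discussed below, structural change makes hard to measure.
    """
    return r_star + inflation + a_pi * (inflation - pi_star) + a_y * output_gap

# Inflation on a 2% target with a closed output gap prescribes 4%;
# a +1 point output gap raises the prescription to 4.5%.
print(taylor_rule(2.0, 0.0), taylor_rule(2.0, 1.0))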
I find the Taylor rule attractive because it is closely aligned both with the objectives of monetary policy and with the model that governs inflation dynamics. That is, the rule responds directly to deviations from the Federal Reserve’s objectives - price stability and an equilibrium utilization rate. And it incorporates a preemptive response to inflation that is consistent with models that assign an important role to unemployment or output gaps in inflation dynamics. However, implementing this strategy requires knowledge of the output gap and the equilibrium real interest rate - variables that appear to have been affected by structural change. As a result, there has been increased focus on how Taylor-type rules should be adjusted in light of the uncertainties associated with structural change. Uncertainty and monetary policy strategy Given uncertainty about the output gap, for example, should we attenuate the response to the output gap or even entirely abandon the output gap as a guide to adjustment in monetary policy? Given the related difficulty in forecasting during a period of structural change, should we be less forward-looking and hence less preemptive? If we are less preemptive, should we compensate by being more aggressively reactive to recent inflation? Or should we more fundamentally change the specification of the policy rule when confronted with these uncertainties? For example, would we minimize the damage from mis-estimates of the NAIRU and trend growth using a nominal income rule instead of a Taylor rule? Or should we allow for a nonlinear instead of a linear policy response to movements in output and inflation? There is, of course, a well-developed literature on the effect of uncertainty on policy. Until recently, I viewed the literature as delineating two simple types of uncertainty, typically referred to as additive uncertainty and multiplicative or parameter uncertainty. Recently, the literature has focused on uncertainty associated with imperfect or noisy observation of the economy in ways that do not neatly fall into the two simple bins. Certainty equivalence holds in the case of simple specifications of additive uncertainty. When certainty equivalence holds, it is optimal for policymakers to respond to the expected values of their targets as if they held these values with complete certainty. In a sense, uncertainty has no effect on policy in this case, though it results in some decline of its effectiveness. Additive uncertainty and certainty equivalence are perhaps best seen as devices to allow the incorporation of some stochastic elements into a model without the complications that arise in more meaningful encounters with uncertainty. In the case of multiplicative uncertainty, the most well known result is William Brainard’s conclusion that policy should be somewhat more cautious in this case. Assuming policymakers don’t like uncertainty, they become less aggressive with their policy instruments, because bolder use of policy adds to uncertainty about outcomes. However, some recent evidence (Arturo Estrella, Rick Mishkin, and Glenn Rudebusch) suggests that this type of uncertainty may have a relatively modest quantitative effect on the policy outcome. Also, newer theoretical results, such as Tom Sargent’s, question the conclusion that parameter uncertainty would make policy more cautious. The newer entry into the uncertainty literature relates to imperfect or noisy observation of the economy, although concern about this problem certainly predates the recent studies. 
For example, by examining the historical record at the Federal Reserve, Athanasios Orphanides uncovered substantial and persistent measurement error associated with estimates of the output gap - one of the measures we sometimes identify with “excess demand”. This uncertainty starts by looking a lot like additive uncertainty, but its policy implications often end up similar to the Brainard result of more cautious policy, with policy response attenuated at least with respect to movements in variables about which there are noisy observations. The literature supporting this attenuation result has at least two strands. One consists of theoretical models based on signal extraction, as in the work that Eric Swanson, and Lars Svensson and Michael Woodford, are presenting at this conference tomorrow. Suppose, as in Swanson’s work, that inflation depends on an unobservable variable we call “excess demand” and policy responds to the unemployment rate, which is only an imperfect indicator of “excess demand”. Since the unemployment rate is a noisy indicator of what the policymaker is interested in - the unobservable “excess demand” - the weight the policymaker will give this variable will vary with its reliability as an indicator for excess demand. Specifically, the less reliable the indicator becomes, the smaller its weight will be in the optimal policy rule and the more weight will be placed on the other indicators about which uncertainty has not changed. In the current context, that means that the weight on the unemployment rate is decreased, while the weight on inflation is increased. In effect, as policy becomes less preemptive in stabilizing inflation, it becomes more aggressive in reacting to recent inflation. The second strand of the literature that supports the attenuation result is based on simulation results, as reflected in the work of Orphanides, Rudebusch, Frank Smets, and others. This work employs simple empirical models and a simple rather than an optimal rule and examines how policymakers should adjust the parameters of the simple rule in light of the uncertainty about the measurement of the output gap. It finds that policymakers should attenuate their response to changes in the output gap and, indeed, should move very cautiously when the confidence with which such measures can be constructed is low. In contrast with the conclusions based on signal extraction models, the simulation results using simple rules generally find that increased uncertainty about the output gap may call for attenuation in the response to both the gap variable and inflation. In some cases, certainty equivalence continues to hold, even with noisy observations. In Swanson’s work, for example, optimal policy still displays certainty equivalence when policy is related to the unobservable excess demand. In the paper that Svensson and Woodford will present tomorrow, where the model relates inflation directly to the observed unemployment gap, certainty equivalence holds provided that the estimate of the gap is updated on the basis of all available data and the true model. This structure and result are also present in the work by Orphanides. So the question is: How general or special is the attenuation result? This question appears particularly relevant to the uncertainties that monetary policymakers are wrestling with today, and I am sure we will have a lot of discussion about this conclusion tomorrow. I suspect the result is a general one for the following reasons.
First, I believe part of the challenge today is finding a proxy for the unobservable excess demand, especially given the divergent movements in the unemployment and capacity utilization rates. Therefore, in my view, Swanson’s conclusion that certainty equivalence holds when policy is expressed in terms of the unobservable “excess demand” is dominated by his conclusion that attenuation holds when policy is made in terms of observables. Put simply, policy authorities are mortals and hence are unable to observe unobservables. Second, given that we don’t know the true model, policymakers might look at simple rules rather than try to derive optimal rules for guidance. This leads me to question the practical significance of certainty equivalence, which requires that policymakers know the true model, use an optimal rule, and update their optimal estimate of the NAIRU based on the true model. I draw the following conclusions from this research. First, policymakers should continuously update their estimates of the NAIRU and the output gap, using all available information, particularly the realizations of unemployment, output, and inflation. Such updating will not entirely erase the problems associated with noisy observations, but it will mitigate them. In my view, policymakers today update their estimates of the output gap and the NAIRU more systematically and more frequently than they once did. This view suggests some caution in deriving the degree of attenuation from historical evidence of revisions to the NAIRU and potential output. Second, policymakers should adjust the aggressiveness of their response to the gaps between actual and target variables in light of the uncertainty about their measurement. Specifically, policymakers should attenuate their response to movements in the unemployment or output gap. There is an important complication in applying this principle. Simulation results suggest that the optimal response to the output gap in the absence of uncertainty might be considerably more aggressive than the parameter in the Taylor rule. The attenuation might therefore result in a response parameter closer to or lower than the Taylor rule value. Third, the literature is less clear about whether policymakers should offset any attenuation in the response to the output gap with a more aggressive response to movements in realized inflation. My instinct tells me that, as policy becomes less preemptive, it should become more aggressively reactive. Taking the second and third conclusions together, the relative weights on the gap variable and inflation should vary, depending on the degree of uncertainty about the output gap. The higher coefficient on the inflation rate might be justified by the fact that inflation has become a better indicator of the excess demand compared with the output gap when there is heightened uncertainty about the measurement of the output gap. The focus of the literature has been on uncertainty about the unemployment or output gap, but a shift in trend productivity growth also results in uncertainty about the equilibrium real interest rate. In this case, a Taylor-type rule should also incorporate some mechanism for updating the estimate of this rate. A nonlinear Taylor rule under uncertainty about key parameters The literature on noisy observations has focused on adjustments in the parameters of linear Taylor-type rules. But I believe that a nonlinear rule may dominate a linear specification in this case. 
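One way to picture such a nonlinear rule is the sketch below; the band width, slope, and functional form are assumptions of mine, chosen only to reproduce the qualitative shape described in what follows:

def gap_response(u, u_star=5.0, band=0.5, full_slope=1.0):
    """Nonlinear policy response to the unemployment gap.

    Inside a band around the NAIRU estimate - where a uniform prior
    makes the gap essentially uninformative - the response is
    attenuated to zero. Outside the band, the marginal response
    phases back in, converging to the certainty-equivalence slope.
    All parameter values are hypothetical.
    """
    gap = u_star - u  # positive when unemployment is below the NAIRU
    if abs(gap) <= band:
        return 0.0
    excess = abs(gap) - band
    response = full_slope * excess ** 2 / (excess + 1.0)
    return response if gap > 0 else -response

for u in (5.2, 4.7, 4.2, 3.7):  # hypothetical unemployment rates
    print(u, round(gap_response(u), 2))  # prints 0.0, 0.0, 0.07, 0.36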
A nonlinear Taylor rule under uncertainty about key parameters The literature on noisy observations has focused on adjustments in the parameters of linear Taylor-type rules. But I believe that a nonlinear rule may dominate a linear specification in this case. I have suggested a nonlinear rule that would attenuate the response to the unemployment rate in a region around the best estimate of the NAIRU but would cause a gradual return to the more aggressive marginal response appropriate under certainty equivalence once the unemployment rate had moved sufficiently below the best estimate of the NAIRU. Such a nonlinear rule could be justified either by nonlinearities in the economy or by a non-normal distribution of policymakers’ prior beliefs about the NAIRU. It is certainly easy to believe that there are nonlinearities in the economy in general and with respect to the Phillips curve in particular. For example, to the extent that the effect on inflation becomes disproportionately larger as the unemployment or output gap increases, the policy response should become more aggressive with each incremental increase in the gap. However, I’m not persuaded that there is a strong case for a nonlinear Phillips curve. So I am inclined to emphasize the possibility of a non-normal distribution of prior beliefs about the NAIRU as the basis for a nonlinear policy rule. An example of a non-normal probability distribution for the NAIRU that would justify the nonlinear policy response I have suggested is a uniform probability distribution around the best estimate. For example, policymakers might have a point estimate of 5% for the NAIRU but believe the NAIRU is equally likely to lie anywhere in the range between 4½% and 5½%. Because policymakers are so uncertain about the NAIRU within this interval, they might be very willing to revise their estimate of the NAIRU roughly in line with the observations of the unemployment rate within it. As a result, movements of the unemployment rate within this range would have little effect on the estimate of the unemployment gap and, therefore, on the target interest rate. However, if the unemployment rate moved outside this range, policymakers might assign a progressively smaller fraction of each increment of the unemployment rate to the NAIRU as the unemployment rate moved still further from policymakers’ best estimate of the NAIRU. In this case, the policy response is attenuated around the best estimate of the NAIRU, but it gradually becomes larger, ultimately converging to the marginal response under certainty equivalence.
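To make the uniform-prior example concrete, the sketch below implements one hypothetical version of such a nonlinear rule. The 5% point estimate and the 4½%-5½% band come from the example above; the smooth convergence function and its speed are illustrative assumptions, not part of the suggestion in the text:

```python
# Hypothetical nonlinear response to the unemployment gap under a uniform
# prior for the NAIRU on [4.5, 5.5] around a 5.0% point estimate.
# The convergence-speed parameter LAM is an illustrative assumption.

import math

NAIRU_ESTIMATE = 5.0
BAND_HALF_WIDTH = 0.5   # uniform prior: NAIRU in [4.5, 5.5]
LAM = 2.0               # how fast the marginal response returns to normal

def perceived_gap(u: float) -> float:
    """Unemployment gap as 'seen' by the nonlinear rule.

    Inside the band, movements in u are attributed to the NAIRU itself, so
    the perceived gap stays near zero. Outside the band, the marginal
    response rises smoothly toward the certainty-equivalence slope of 1.
    """
    dist = abs(u - NAIRU_ESTIMATE)
    if dist <= BAND_HALF_WIDTH:
        return 0.0
    excess = dist - BAND_HALF_WIDTH
    # Integral of the marginal response 1 - exp(-LAM * x) from 0 to excess:
    response = excess - (1.0 - math.exp(-LAM * excess)) / LAM
    return math.copysign(response, u - NAIRU_ESTIMATE)

for u in (5.2, 4.8, 4.4, 4.0, 3.5):
    print(f"unemployment {u:.1f}% -> perceived gap {perceived_gap(u):+.2f}")
```

Within the band the rule does not move at all; just outside it the perceived gap grows very slowly; far outside, each further decline in unemployment adds nearly one-for-one to the gap, restoring the certainty-equivalence response at the margin.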
Monetary policy’s adjustment to uncertainty about key parameters Is the recent monetary policy response consistent with the lessons I have drawn from the literature on uncertainty associated with noisy observations? The following discussion draws on the evolution of my own thinking, as well as on the policy actions, including the announced tilts in policy and the text of the announcements that accompanied policy actions. I pick up this episode in the middle of 1996, when I joined the Board. It seems to me that initially monetary policy was consistent with a backward-looking Taylor rule (although that is sometimes difficult to distinguish from a forward-looking Taylor rule). We were faced with two surprises: faster-than-expected growth (resulting in a higher-than-expected estimated output gap) and lower-than-expected inflation. These had offsetting effects on the nominal funds rate, yielding a nearly unchanged policy until the fall of 1998 and a policy that closely tracked the Taylor rule prescription, at least allowing for updates of the estimated unemployment or output gaps along the way. Alternatively, this policy could be viewed as allowing a passive rise in the real rate that turned out to be well calibrated to the rise in the output gap. As the episode progressed, more questions were raised - both inside and outside the Federal Reserve - about the values of the NAIRU and trend growth. The staff continually adjusted its estimates of these parameters in response to incoming data. In the fall of 1998, monetary policy responded both to the financial market distress and to the abrupt change in the forecast for growth (and hence utilization rates) in the United States. I put more weight on the forecast and less on recent observations in the context of a Taylor rule, a stance I thought was justified by the abrupt change in the forecast and by the unusually sharp contrast between the forecast and the still-strong incoming data. Once it became apparent that the US economy was maintaining its momentum despite weaker foreign growth and that financial markets had returned toward normal, the growing uncertainty about the output gap - reflecting the continuing contradiction of declining inflation and rising output gaps - made monetary policymakers cautious about aggressively reversing their policy actions. But through early 1999 we remained somewhat concerned about the degree of recovery in both financial markets and foreign economies. The net result was that the nominal funds rate remained constant during this phase, instead of tightening in line with the Taylor rule prescription. In effect, policymakers could be interpreted as attenuating their response to the unemployment and output gap in line with the theoretical models and empirical results I have talked about. Beginning in mid-1999, with the estimated output gap increasing further and growth shifting to a still-higher gear, policymakers became more concerned about the possibility of overheating and, hence, the risks of higher inflation. The tightenings in 1999 could be interpreted as unwinding the earlier easings, once the factors that motivated the easings had themselves reversed. Of course, every policy action needs to be defended in terms of its contribution in the future to achieving the objectives of monetary policy. In this spirit, I interpreted the tightening moves as preemptive attempts to limit inflation risks. Why did policymakers tolerate for a while further increases in the output gap, and why did they subsequently become more concerned about the inflation risks from further increases in the output gap? I think the change can be rationalized in terms of my discussion of the case for a nonlinear policy response under uncertainty. As the unemployment rate fell farther below the best estimates of the NAIRU and the risk of overheating increased, policymakers became less tolerant of continued above-trend growth. Monetary policy strategy in light of the uncertainties associated with structural change Looking backward, I think we can find at least a hint of attenuation of the response to changes in the unemployment rate and, more recently, a hint of a nonlinear policy response. What does this suggest about monetary policy strategy going forward? The current strategy can, I believe, be viewed as a two-step process. The first step is, preemptively, to slow growth to trend. If successful, this step would limit, though not necessarily remove, the threat of overheating, if output has already advanced beyond potential.
The second step is to respond reactively to higher inflation, should the prevailing output gap prove to be inconsistent with stable inflation. The first step is a continuation of the strategy underlying the recent policy tightenings. In my judgment, the unemployment rate has already declined to a sufficiently low level relative to my estimate of the NAIRU that we should no longer be attenuating the marginal policy response to further declines. But the current policy is, in my view, also an aggressive version of such a strategy because it is not a nonlinear response to further declines in the unemployment rate, but a forward-looking attempt to prevent further tightening of the labor market. I think that one of the subtleties of policy is sometimes being content to respond incrementally to the incoming data and sometimes becoming more aggressive and responding to forecasts. It is best that policymakers are transparent about such shifts in the relative weight on the forecast in their policy decisions. Once growth has slowed to trend and the output gap stabilizes, monetary policy may become more reactive, given the continued uncertainty about the levels of the NAIRU and the output gap. That is, policymakers might be prepared to slow the economy to trend growth to avoid the risk of higher inflation associated with still-lower unemployment rates and higher output gaps, but might be reluctant to reduce the perceived output gap without evidence from realized inflation that the prevailing gap is unsustainable. Under such a policy, the response to inflation should, in my view, be more aggressive than it would otherwise be, for example, in the Taylor rule under certainty. This is an example of offsetting the attenuation in the response to the output gap with a more aggressive response to inflation realizations. In effect, the policy setting at trend growth and at the prevailing level of the output gap is presumed to be consistent with stable inflation. An increase in inflation (specifically in core inflation) would be evidence that the output gap is not in fact sustainable. As a result, the increase in interest rates should be the combined response to a slight increase in the estimate of the NAIRU and to an increase in the inflation rate at an unchanged estimate of the NAIRU. A final component of the strategy, in my view, should be that policy should tighten further - above and beyond what is presumed to be necessary to slow the economy to trend - to the extent that efforts to stabilize the output gap fall short. For example, let us assume that growth ultimately moves to trend but, in the interim, the continued above-trend growth increases the output gap still further. In response, policy should tighten incrementally, encouraging below-trend growth and hence unwinding the further increase in the output gap. The strategy I have described would reduce the prospects of policy responding to noise in estimates of key real-side variables in the economy and would increase the prospects of allowing the economy to realize the full benefits of the recent improvements in aggregate supply. However, it does risk allowing excess demand to build until it shows up in inflation and may ultimately require a more aggressive response of interest rates, if the range of attenuation does not in fact correspond to a decline in the NAIRU.
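One way to read this two-step strategy is as a simple decision procedure. The sketch below is an illustrative reading, not a stated rule: the trend-growth margin, the core-inflation trigger, and the described actions are all assumed for the example.

```python
# Illustrative sketch of the two-step strategy: first slow growth to trend
# preemptively, then react to realized core inflation. Thresholds and the
# described actions are assumptions for illustration only.

def policy_action(growth, trend_growth, core_inflation, prior_core_inflation):
    # Step 1: preemptive - while growth exceeds trend, keep tightening.
    if growth > trend_growth + 0.25:
        return "tighten: slow growth toward trend"
    # Step 2: reactive - at trend, move only on evidence from realized
    # core inflation that the prevailing output gap is unsustainable.
    if core_inflation > prior_core_inflation + 0.2:
        # Respond aggressively: part NAIRU revision, part inflation response.
        return "tighten aggressively: realized inflation signals excess demand"
    return "hold: prevailing gap presumed consistent with stable inflation"

print(policy_action(growth=5.0, trend_growth=4.0,
                    core_inflation=2.0, prior_core_inflation=2.0))
print(policy_action(growth=4.0, trend_growth=4.0,
                    core_inflation=2.5, prior_core_inflation=2.0))
print(policy_action(growth=4.0, trend_growth=4.0,
                    core_inflation=2.0, prior_core_inflation=2.0))
```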
Conclusion Structural change complicates the task of monetary policy. Of course, it is not difficult to put up with this additional burden when the structural change takes the form of a decline in the NAIRU and an increase in trend productivity growth. It would not be easy for monetary policy to turn such good fortune into poor macroeconomic performance. But the uncertainties about the nature and degree of structural change confront policymakers with the task of striving to realize the benefits of a decline in the NAIRU and an increase in trend growth while trying to avoid the inflationary consequences of overtaxing the new limits. Recent work on signal extraction models and on the implications of noisy observations provides some important guidance about how to adjust the strategy of monetary policy in the face of the new uncertainties. This conference provides a timely opportunity to assess what we have learned and how it might be applied to monetary policy today, as well as to point to areas that may deserve further exploration. References Brainard, W: “Uncertainty and the Effectiveness of Policy”, American Economic Review, vol. 57 (1967), pp. 411-25. Estrella, A, and F Mishkin: “Rethinking the Role of NAIRU in Monetary Policy: Implications of Model Formulation and Uncertainty”, in J B Taylor, ed., Monetary Policy Rules. Chicago: NBER and University of Chicago Press, 1999. Orphanides, A: Monetary Policy Evaluation with Noisy Information, Finance and Economics Discussion Series No. 1998-50. Washington, D.C.: Board of Governors of the Federal Reserve System, November 1998. Rudebusch, G: “Is the Fed Too Timid? Monetary Policy in an Uncertain World”, mimeo, Federal Reserve Bank of San Francisco, April 1999. Sargent, T: “Discussion of ‘Policy Rules for Open Economies’ by Lawrence Ball”, in J B Taylor, ed., Monetary Policy Rules. Chicago: NBER and University of Chicago Press, 1999. Smets, F: Output Gap Uncertainty: Does It Matter for the Taylor Rule? BIS Working Paper No. 60, November 1998. Svensson, L, and M Woodford: “Indicator Variables for Monetary Policy”, mimeo, Princeton University, February 2000. Swanson, E: “On Signal Extraction and Non-Certainty Equivalence in Optimal Monetary Policy Rules”, mimeo, Board of Governors of the Federal Reserve System, March 2000.
Mr Greenspan remarks on some of the economic challenges facing the United States in the new century Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the Annual Conference of the National Community Reinvestment Coalition, held in Washington, DC, on 22 March 2000. * * * Over the past several days, you have been engaged in sharing a good deal of practical information on developments in the financial services industry and on the evolving set of laws and regulations that influence the availability of credit in the communities that you serve. No doubt, many of you are here this morning because of your long-standing interest in the Federal Reserve’s implementation of the Community Reinvestment Act (CRA). However, because we are now in the final stages of drafting regulations on the Sunshine Provisions of the Gramm-Leach-Bliley Act, I am prohibited from commenting at this time. Instead, I would like to discuss with you, in broader terms, some of the challenges facing businesses, workers, and consumers - including those in your communities - as the U.S. economy embarks on a new century. As you know, we have recently established a record for the longest economic expansion in this nation’s history. In recent years, it has become increasingly clear that this business cycle differs in a very profound way from the cycles that have characterized post-World War II America. Not only has the expansion achieved record length, but it has done so with far stronger growth than expected. A key factor behind this impressive performance has been the remarkable acceleration in labor productivity, with output per hour in the nonfinancial corporate sector increasing since 1995 at nearly double the average pace of the preceding quarter-century. And because technological change has spawned so many opportunities for businesses to expand the range and value of their goods and services, the introduction of new efficiencies has not led to higher unemployment. Rather, the recent period of technological innovation has created a vibrant economy in which opportunities for jobs and new businesses have expanded, enhancing the living standards of a large majority of Americans. Our challenge, then, is to ensure that we - both policymakers and community leaders - extend the favorable macroeconomic performance and strive to bolster the capabilities of all Americans to share in the prosperity that is being generated. When historians look back at the latter half of the 1990s a decade or two hence, I suspect that they will conclude that we are now living through a pivotal period in American economic history. New technologies that evolved from the cumulative innovations of the past half-century have now begun to bring about dramatic changes in the way goods and services are produced and in the way they are distributed to final users. How did we arrive at such a fascinating and, to some, unsettling point in history? While the process of innovation, of course, is never-ending, the development of the transistor after World War II appears in retrospect to have initiated a special wave of innovative synergies. It brought us the microprocessor, the computer, satellites, and the joining of laser and fiber-optic technologies. By the 1990s, these and a number of lesser but critical innovations had, in turn, fostered an enormous new capacity to capture, analyze, and disseminate information. 
It is the growing use of information technology throughout the economy that makes the current period unique. For the consumer, advances in technology and in the flow of information have greatly facilitated the development of a wide range of new financial products that are better suited to meeting the preferences of diverse populations. Similarly, in the case of consumer and business credit, computer and telecommunications technologies - the same forces that are shaping the broader economy - have lowered the cost and broadened the scope of financial services. As a consequence of these developments, borrowers and lenders are increasingly able to transact directly with each other, and we have seen a proliferation of specialized lenders and new financial products that are tailored to meet very specific market needs. At the same time, the development of credit-scoring models and the securitization of pools of loans hold the potential for opening the door to national credit markets for a broad spectrum of businesses operating in local and regional markets. Indeed, the CRA data on small business lending show that institutions located outside the local community are an important source of credit for many businesses. Much attention is focused on the role of corporate giants in fostering innovation, but we would be foolish to understate the extent to which America’s innovative energy draws, and will continue to draw, from the interaction of both large and small businesses. Nowhere in the world are the synergies of small and large businesses operating side by side in a dynamic and competitive market economy more apparent than in this country. Of course, the surging growth of young high-tech firms and the flashy presence of new Internet businesses capture the most public attention. But judging from our contacts through our regional Federal Reserve Banks and information collected in surveys of small businesses, times have been good for expanding traditional lines of business as well. The most common complaints include the difficulty of finding qualified workers in the midst of strong competing demands for labor. In the current expansion, the vast majority of small businesses have not listed access to credit as their top concern, although, as you know, many business owners are quite apprehensive about the future as the familiar ways of financing business undergo sometimes dramatic changes. Several recent developments hold the promise of improving links between financial institutions and the small businesses in your communities. First, major banks and finance companies are trying mass-market approaches to small business finance, similar to the approaches used in the consumer credit arena for many years, and this effort has greatly expanded the competition for loans. In addition, new innovative intermediaries - such as community development corporations and multibank and investor loan pools - are seeking to develop expertise in specific segments of the marketplace for small and minority businesses. I would like to emphasize, however, that credit alone is not the answer for small businesses. They must have equity capital before they are considered viable candidates for debt financing. Equity acts as a buffer against the vagaries of the marketplace, and it is, accordingly, a sign of the creditworthiness of a business enterprise and the commitment of its owner. 
This is especially true in lower-income communities, where the weight of excessive debt obligations on small firms can severely impede growth prospects or more readily lead to business failures. Overall, our evolving economic and financial systems have been highly successful in promoting growth and higher standards of living for the majority of our citizens. But we need to reach further to engage those who have not been able to participate. One way is through the education and training of our workforce - that is, enhancing our stock of “human capital,” which is a necessary complement to our ever-changing physical capital. A major consequence of the fast-paced technological change of recent years and the growing conceptual emphasis of our nation’s output has been to increase the demand for skilled workers. In today’s economy, skill has taken on a much broader meaning than it had only a decade or two ago. Today’s workers must be prepared along many dimensions - not only with technical know-how but also with the ability to create, analyze, and transform information and with the capacity to interact effectively with others. Moreover, they must recognize that, with new technologies coming rapidly on line, the skills that they develop today will likely not last a lifetime. Traditionally, broader human capital skills have been associated with higher education, and accordingly the demand for college-trained workers has been increasing rapidly. The result has been that, over the past several decades, the economic returns to workers with college training have on average outstripped those to workers who stopped their formal schooling with a high-school diploma or less. In the past few years, real wage gains for college-educated workers have continued to be rapid, but owing to dynamic economic growth and tightening labor markets, increases for other workers, on average, have kept pace. Nonetheless, a wide gap between the wages of college-educated workers and those of high-school-trained workers remains. Another consequence of rapid economic and technological change that needs to be addressed is a higher level of worker insecurity, which is the result, I suspect, of fears of potential job skill obsolescence. Despite the tightest labor markets in a generation, more workers currently report that they are fearful of losing their jobs than similar surveys found in 1991 at the bottom of the last recession. The marked move of capital from failing technologies to those at the cutting edge has quickened the pace at which job skills become obsolete. The completion of high school once equipped the average worker with sufficient skills to last a lifetime. That is no longer true, as evidenced by the trends in workers returning to school and in businesses expanding and upgrading their on-the-job training. Certainly, higher education will continue to play an important role in preparing workers to meet the evolving demands for skilled labor. But the pressure to enlarge the pool of skilled workers requires that we recognize the significant contributions of other educational programs in your communities. Community colleges, for example, have become an important provider of job skills training not just for students who may eventually move on to a four-year college or university but for individuals with jobs - particularly older workers seeking to retool or retrain.
In some cases, community colleges are providing contract training for employers, part of a broader trend in which employers and their workers are recognizing that to maintain human capital, investment in formal training programs must complement experience on the job. As one might expect, greater worker insecurities are also creating political pressures to reduce the fierce global competition that has emerged in the wake of our 1990s technology boom. Protectionist measures, I have no doubt, could temporarily reduce some worker anxieties by inhibiting these competitive forces. However, over the longer run such actions would slow innovation and impede the rise in living standards. They could not alter the eventual shifts in production that owe to enormous changes in relative prices across the economy. Protectionism might enable a worker in a declining industry to hold onto his job longer. But would it not be better for that worker to seek a new career in a more viable industry at age 35 than to hang on until age 50, when alternative job opportunities would be far scarcer and when the lifetime benefits of additional education and training would be necessarily smaller? To be sure, assisting those who are already close to retirement in failing industries is an imperative. But that can be readily accomplished without distorting necessary capital flows to newer technologies through protectionist measures. More generally, we must ensure that our whole population receives an education that will allow full participation in this dynamic period of American economic history. No doubt, in your communities many workers may view the changing needs of their employers as a threat to the security of their job; and perhaps students preparing to enter the workforce see the demand for rising skills as a hurdle too high to overcome with the limited resources available to them. You, as community leaders, can continue to explore ways of developing creative linkages between businesses and educational institutions to better prepare students for the rising demands of the workplace and to help workers, who must keep up with those changing demands and who must cope with the consequences of global competition, renew and upgrade their skills. As I indicated earlier, one notable aspect of the remarkable performance of our economy in recent years has been the substantial, and relatively broadly based, rise in real income. During the past several years, workers, including those at the low end of the wage distribution, have seen noticeable increases in the inflation-adjusted value of their wages; more comprehensive Census Bureau figures on the real money income of families also show gains in each quintile between 1996 and 1998, and presumably when the 1999 data become available further improvement will be evident. These recent increases for low-income workers, however, have not reversed the rise in wage inequality that occurred during the 1980s and early 1990s, when the gap in wages between those at the top and the bottom of the distribution was widening considerably. Nonetheless, the leveling off in that disturbing trend is an encouraging sign of what can be achieved if we can maintain strong and dynamic labor markets accompanied by low inflation. Of course, we need also to consider trends in wealth, which, more fundamentally than earnings or income, represent a measure of the ability of households to consume.
The Federal Reserve’s Survey of Consumer Finances indicates that the median real net worth of families increased 17½% between 1995 and 1998. As one might expect, the rising stock market coupled with the spreading ownership of equities was an important factor. However, even in the face of the strong aggregate trend, median net worth declined over this period for families with incomes below $25,000, and medians for non-whites and Hispanics were little changed. That families with low-to-moderate incomes and minorities did not appear to fully benefit from the highly favorable economic developments of the mid-1990s is, of course, troubling, and the survey results warrant a closer look. In the details, we find that families with incomes below $25,000 did increase their direct or indirect holdings of stock, and more reported that they had a transactions account. However, they were less likely to hold nonfinancial assets - particularly homes, which constitute the bulk of the value of assets for those below the top income quintile. At the same time, one encouraging finding from the survey is that the homeownership rate among minorities rose from 44% to 47% between 1995 and 1998, which may be a sign of improved access to credit for minorities. Although market specialization, competition, and innovation have vastly expanded credit to virtually all income classes, under certain circumstances this expanded access may not be entirely beneficial, either for customers in general or for lower-income communities. Of concern are abusive lending practices that target specific neighborhoods or vulnerable segments of the population and can result in unaffordable payments, equity stripping, and foreclosure. The Federal Reserve is working on several fronts to address these issues and recently convened an interagency group to identify aberrant behaviors and develop methods to address them. I have no illusions that the task of breaking down barriers that have produced disparities in income and wealth will be simple. It remains an important goal because societies cannot thrive if significant segments perceive their functioning as unjust. Although we have achieved much in this regard, more remains to be done. Despite the considerable progress evident in recent decades in reducing racial and other forms of discrimination, this job is far from complete. Discrimination is against the interests of business - yet business people too often practice it. To the extent that market participants discriminate, they erect barriers to the free flow of capital and labor to their most profitable employment, and the distribution of output is distorted. In the end, costs are higher, less real output is produced, and national wealth accumulation is slowed. By removing the non-economic distortions that arise as a result of discrimination, we can generate higher returns to both human and physical capital. We are experiencing an extraordinary period of economic innovation. At the policy level, we must work to configure monetary policies that will foster a continuation of solid growth and low inflation. Beyond this primary mandate, we at the Federal Reserve are also responding to the challenge of ensuring that all communities can fully participate in our growing prosperity. With our Community Affairs program we provide information, instruction, and technical assistance to a diverse range of constituents regarding community reinvestment, community economic development, fair lending, and related issues.
Our reach is broad: during 1999 more than 15,000 participants attended our conferences and seminars, and we responded to more than 800 requests for in-depth technical assistance. We are also increasing the research focus on topics related to community and economic development and in 2001 will host a second national conference, this one focusing on the theme of changing financial markets and community development. Your participation in, and support of, these activities is important because you play such a crucial role in helping communities respond to the evolving financial, educational, and technological demands of this new century. As I indicated in my opening remarks, future historians are likely to conclude that the past five years have been a pivotal period in American economic history. I trust they will also conclude that it was a period that set in place policies to foster the eventual emergence of full participation of that segment of the workforce that has not fully shared in our economic progress.
Alan Greenspan: Technological innovation and the US economy Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the White House Conference on the New Economy, Washington, D.C., held on 5 April 2000. * * * It has become increasingly difficult to deny that something profoundly different from the typical postwar business cycle has emerged in recent years. Not only has the expansion reached record length, but it has done so with far stronger-than-expected economic growth. Most remarkably, inflation has remained subdued in the face of labor markets tighter than any we have experienced in a generation. While there are various competing explanations for an economy that is in many respects without precedent in our annals, the most compelling appears to be the extraordinary surge in technological innovation that developed through the latter decades of the last century. In the early 1990s, with little advance notice, those innovations began to offer sharply higher prospective returns on investment than had prevailed in earlier decades. The first sign of the shift was the sharp rise in capital investment orders, especially for high-tech equipment, in 1993. This was unusual for a cyclical expansion because it occurred a full two years after the trough of the 1991 recession. By 1995, the investment boom had gathered momentum, suggesting that earlier expectations of elevated profitability had not been disappointed. In that year, with inflation falling, domestic operating profit margins started to rise, indicating that increases in unit costs were slowing. These developments signaled that productivity growth was probably beginning to move higher, even though official data, hobbled by statistical problems, failed to provide any confirmation. Now, five years later, there can be little doubt that not only has productivity growth picked up from its rather tepid pace during the preceding quarter century but that the growth rate has continued to rise, with scant evidence that it is about to crest. The acceleration of productivity stemming from the investment boom has held cost increases in check. Despite the surge in demand, unit labor costs over the past year have barely budged, and pricing power has remained well contained. Apparently, firms hesitate to raise prices for fear that their competitors will be able to wrest market share from them by employing new investments to produce at lower costs. Indeed, the increasing availability of labor-saving equipment and software, at declining relative prices and with improving delivery lead times, is arguably at the root of the loss of business pricing power in recent years. To be sure, marked increases in available global capacity and the deregulation of key industries have removed bottlenecks and increased the competitive supply response of many economies, especially ours, and these developments have been influential in suppressing price increases. It would be an exaggeration to imply that, whenever a potential cost increase emerges on the horizon, a capital investment is available to quell it. Yet the veritable explosion of spending on high-tech equipment and software, which has raised the growth of the capital stock dramatically over the past five years, could hardly have occurred without a large increase in the pool of profitable projects available to business planners. 
As our experience over the past century and more attests, such surges in prospective investment profitability carry with them consequences for interest rates, which ultimately are part of the process that balances saving and investment in a non-inflationary economy. In these circumstances, rising credit demand is almost always reflected in an increase in corporate borrowing costs and that has, indeed, been our recent experience, especially in longer-dated debt issues. Real interest rates on corporate bonds have risen more than a percentage point in the past couple of years. Home mortgage rates have risen comparably. The Federal Reserve has responded in a similar manner, by gradually raising the federal funds rate over the past year. Certainly, to have done otherwise - to have held the federal funds rate at last year’s level even as credit demands and market interest rates rose - would have required an inappropriately inflationary expansion of liquidity. It is difficult to imagine product price levels remaining tame over the longer haul had there been such an expansion of liquidity. In the event, of course, inflation has remained largely contained. To be sure, the tripling of crude oil prices has left its mark on “headline” inflation rates and inflicted considerable pain on some sectors of our economy. However, there is little evidence, at least to date, to suggest that oil price increases have started to embed themselves more broadly in the underlying cost structure of American business - that is, beyond the direct effects of the higher energy costs themselves. Nevertheless, despite the very recent declines in the price of oil, there are risks here that need to be monitored closely. Given the persistent strength of private credit demands, market interest rates would have risen even more were it not for the emergence of a sizable unified budget surplus following a long period of chronic deficits. More recently, the Administration and the Congress have wisely chosen to wall off the surplus in the social security trust fund and to allow it to pay down Treasury debt held by the public. This action will surely contribute to sustaining the rapid private capital formation we have experienced in recent years. I see no reason that productivity growth cannot remain elevated, or even increase further, to the undeniable benefit of American businesses and workers. Achieving this outcome, however, requires that imbalances do not arise to drive the expansion off course. Only a balanced prosperity can continue indefinitely; one that is not will eventually falter. A change in market interest rates is an important element of the balancing mechanism of a market economy. Some misalignments have arisen over the course of the expansion. Owing largely to the increased rate of return on capital and a sizable wealth effect, overall demand for goods and services for the past four years has been growing noticeably in excess of the enhanced growth in potential supply, defined as the sum of the growth in the working-age population and productivity. An increasing share of the goods and services required to meet this extra demand has been supplied by net imports, with the remainder the result of an increase in domestic production achieved by drawing down the pool of those we count as officially unemployed and those otherwise available for work. 
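A back-of-the-envelope version of the supply-demand accounting just described, with purely illustrative numbers rather than data: potential supply growth is approximated by working-age population growth plus productivity growth, and demand growth in excess of that must be met by net imports or by drawing down the pool of available workers.

```python
# Illustrative growth accounting; all figures are assumptions, not data.
population_growth = 1.0      # % per year, working-age population
productivity_growth = 2.5    # % per year, output per hour
demand_growth = 4.5          # % per year, aggregate demand

potential_supply_growth = population_growth + productivity_growth
excess_demand_growth = demand_growth - potential_supply_growth

print(f"potential supply growth: {potential_supply_growth:.1f}%")
print(f"excess of demand over potential supply: {excess_demand_growth:.1f}%")
# Under these assumptions, the 1.0% excess must be met by rising net
# imports or by employing workers drawn from the shrinking available pool.
```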
Short of a significant opening up of our borders to more immigration, an increase in employment beyond the growth of the working-age population is limited to what remains of our shrinking pool of available workers. Although the sum of the unemployed and those not in the labor force but who nonetheless are available for work is still about ten million, the level has been falling steadily. This year, the figure has been lower as a percentage of the population than at any time in the history of this series, which goes back to 1970. Should the pool of available workers continue to shrink, there is a point at which this safety valve for excess demand will effectively close, even in the face of accelerating productivity. We do not know where that point is, but presumably it would occur well before a full depletion of the pool of potential workers. When we reach that point, short of a repeal of the law of supply and demand, the scarcity of labor will almost surely induce hourly compensation gains that increasingly outpace productivity growth, even if productivity continues to accelerate - a condition that would cause unit costs to accelerate over time. Moreover, we do not know how long net imports and US external debt can rise before foreign investors become reluctant to continue to add to their portfolios of claims against the United States. At that point, the safety valve of net imports could narrow or close. It is conceivable that these two buffers can continue to absorb an excess growth of demand over potential supply for quite a while longer. However, the significant uncertainties surrounding these new economic forces counsel prudence. We need to be careful to keep inflationary pressures contained: The evidence that inflation inhibits economic growth and job creation is too credible to ignore. Consequently, maintaining an environment of low and stable inflation provides the greatest opportunity for the dramatic increases in structural productivity to show through fully into higher standards of living. In that regard, readings from financial markets, despite their recent upheavals, suggest that participants perceive the most likely outcome to be a gradual adjustment to more balanced non-inflationary growth. As I have argued previously, a substantial part of the excess growth of demand over potential supply owes to a wealth effect, induced by the rising asset prices that have accompanied the run-up in potential rates of return on new and existing capital. The rise in stock prices, as well as in the capital gains on homes, has created a marked increase in purchasing power without providing an equivalent and immediate expansion in the supply of goods and services. That expansion in supply will occur only over time. The persuasive evidence that the wealth effect is contributing to the risk of imbalances in our economy, however, does not imply that the most straightforward way to restore balance in financial and product markets is for monetary policy to target asset price levels. Leaving aside the deeper question of whether asset price targeting is an appropriate governmental function, there is little, if any, evidence that monetary policy aimed at achieving that goal would be successful. The risks of investing in equities come primarily from uncertainty about future earnings and about the rates at which those future earnings should be discounted, and much less from changes in overnight interest rates, the principal tool of the central bank.
Consequently, even if we were to foster somewhat larger movements in short-term rates to address changes in stock prices, I doubt that investors’ perceptions of equity risks would be much affected and thus that equity prices would be meaningfully influenced. In short, monetary policy should focus on the broader economy and on pending inflationary or deflationary imbalances. Should changes in asset prices foster economic imbalances, as they appear to have done in recent years, it is the latter we need address, not asset prices. * * * In the economy overall, one result of the more-rapid pace of information technology innovation has been a visible acceleration of the process of “creative destruction”, a shifting of capital from failing technologies into those technologies at the cutting edge. The process of capital reallocation across the economy has been assisted by a significant unbundling of risks in capital markets made possible by the development of innovative financial products, many of which themselves owe their viability to advances in information technology. There are few, if any, indications in the marketplace that the reallocation process, pushed forward by financial markets, is slowing. While growth in companies’ projected earnings has been revised up almost continuously across many sectors of the economy in recent years, the gap in expected profit growth between technology firms and others has persistently widened. As a result, security analysts’ projected five-year growth of earnings for technology companies now stands nearly double that for the remaining S&P 500 firms. To the extent that there is an element of prescience in these expectations, it would reinforce the notion that technology synergies are still expanding and that expectations of productivity growth are still rising. There are many who argue, of course, that it is not prescience but wishful thinking. History will judge. * * * Before this revolution in information availability, most 20th-century business decision-making had been hampered by pervasive uncertainty. Owing to the paucity of timely knowledge of customers’ needs and of the location of inventories and materials flowing throughout complex production systems, businesses required substantial programmed redundancies to function effectively. Doubling up on materials and people was essential as backup to the inevitable misjudgments of the real-time state of play in a company. Decisions were made from information that was hours, days, or even weeks old. Accordingly, production planning required costly inventory safety stocks and backup teams of people to respond to the unanticipated and the misjudged. Clearly, the remarkable surge in the availability of more timely information in recent years has enabled business management to remove large swaths of inventory safety stocks and worker redundancies. That means fewer goods and worker hours are absorbed by activities that, while perceived as necessary insurance to sustain valued output, in the end produce nothing of value. These developments emphasize the essence of information technology - the expansion of knowledge and its obverse, the reduction of uncertainty. As a consequence, risk premiums that were associated with many forms of business activities have declined. In short, information technology raises output per hour in the total economy principally by reducing hours worked on activities needed to guard productive processes against the unknown and the unanticipated. 
Narrowing the uncertainties reduces the number of hours required to maintain any given level of production readiness. Because knowledge is essentially irreversible, much, if not most, of the recent gains in productivity appear permanent. Expanding e-commerce is expected to significantly augment this trend. Already major efforts have been announced in the auto industry to move purchasing operations to the internet. Similar developments are planned or are in operation in many other industries as well. It appears to be only a matter of time before the internet becomes a prime venue for the trillions of dollars of business-to-business commerce conducted every year. Not all technologies, information or otherwise, however, increase productivity - that is, output per hour - by reducing the inputs necessary to produce existing or related products. Some new technologies bring about new goods and services with above-average value added per workhour. The dramatic advances in biotechnology, for example, are significantly increasing a broad range of productivity-expanding efforts in areas from agriculture to medicine. Indeed, in our dynamic labor markets, the resources made redundant by better information are being drawn to the newer activities and newer products, many never before contemplated. The recent biotech innovations are most especially of this type, particularly the remarkable breadth of medical and pharmacological product development. One less welcome by-product of rapid economic and technological change that needs to be addressed is the insecurity felt by many workers. This stems, I suspect, from fear of job skill obsolescence. Despite the tightest labor markets in a generation, for example, more workers currently report in a prominent survey that they are fearful of losing their jobs than was reported in 1991, at the bottom of the last recession. The marked movement of capital from failing technologies to those at the cutting edge has quickened the pace at which job skills become obsolete. The completion of high school used to equip the average worker with sufficient knowledge and skills to last a lifetime. That is no longer true, as evidenced by community colleges being inundated with workers returning to school to acquire new skills and by on-the-job training being expanded and upgraded by a large proportion of American business. It is not enough to create a job market that has enabled those with few skills to finally be able to grasp the first rung of the ladder to achievement. More generally, we must ensure that our whole population receives an education that will allow full and continuing participation in this dynamic period of American economic history. * * * In summary, we appear to be in the midst of a period of rapid innovation that is bringing with it substantial and lasting benefits to our economy. But policymakers must be alert to the full range of possible outcomes. In the end, I do not believe we can go far wrong if we maintain a consistent, vigilant, non-inflationary monetary policy focused on achieving maximum sustainable economic growth, a fiscal policy that produces substantial saving to accommodate investment in productive capital, a trade policy that fosters international competition through broadened market access, and an education policy that ensures that all Americans can acquire the skills needed to participate in what may well be the most productive economy ever.
Roger W Ferguson, Jr: Realism during times of opportunity Speech by Mr Roger W Ferguson, Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, at Widener University, Chester, Pennsylvania on 6 April 2000. * * * It is a pleasure to be here at Widener University and to join the list of Federal Reserve officials who have addressed this distinguished audience. As you know, the current economic expansion is now the longest in our nation’s history. This is due, in part, to a combination of good fiscal and monetary policies. However, more than anything that we do in Washington, D.C., the economy’s strength is the direct result of the myriad private decisions made on a daily basis by you and all Americans. In a very real sense, you are the heroes of this story, and you deserve to be commended. Having said that, it is important to be mindful that now is the most important time for realism, prudence, and vigilance by both policymakers and the public at large. We should remember that generally good economic times can soften the impact of - and often mask totally - poor judgments. Eventually, however, those judgments will have detrimental consequences. Let me focus on three areas of realism that are required in this time of historic opportunity: the financial sector, individual decisions, and the international sphere. Realism in the financial sector As you know, at the end of last year, the Congress passed and the President signed into law a bill to modernize the financial industry of the United States. This law, the Gramm-Leach-Bliley Act, presents opportunities and challenges for the financial sector, which must be approached realistically and prudently. The most obvious opportunity for the financial sector now is one of ongoing consolidation and broadening - consolidation largely in continuing response to the end of legal constraints on geographical operations and broadening as financial institutions take advantage of the opportunities to expand lines of business offered by the act. The consolidation movement among banking organizations, of course, predates the passage of the most recent financial modernization law. In fact, it is reasonable to believe that the forces for consolidation and broadening were so strong that they provided an impetus for the repeal of Glass-Steagall, following a generation of effort not only by the Congress but also by financial institutions and regulators. Nevertheless, by enhancing certainty about what can be done and how it can be done, I believe that the financial modernization law will likely bring an increase in mergers among firms that had been specializing in different financial services. These mergers will be undertaken to take advantage of the synergies and cost advantages expected from such combinations. If synergies in back office operations or in delivery of service can be captured, such linkages might well provide a more efficient, and hence less costly, delivery of complementary services. The extent of consolidation and broadening remains in question, however, which gives rise to the second opportunity: the opportunity to deepen specialization. In a world of large, full-service providers, I think there will be demand for specialty providers. These niche players presumably will be smaller and potentially more competitively agile than the larger competitors.
The ability to foresee the creation of important specialty competitors requires no more than an ability to generalize from other industries, such as retailing, in which consolidation has existed side by side with the emergence of successful specialty businesses. Another opportunity open to the financial sector is the continuation of the impressive trend toward globalization and international consolidation. Major overseas markets are becoming more open, although the degree of openness varies from market to market. “Big bangs” have occurred in major markets, and privatization is the norm in a number of others. Of course, wisely or not, some still demand “national champions”, and others worry about privatizing the “crown jewels”. Importantly, the sophistication, scale, and scope that firms are building domestically, both here and in Europe, translate easily into a global platform. Examples abound: not only the success of US securities firms in European and Asian markets but also European banks in the United States. Also, the emergence of the euro as a successful global currency, with the payments infrastructure, unified monetary policy, and converging fiscal policies that are associated with it, creates an attractive, large market. This is a market that US firms have found, and will continue to find, hospitable and in which European cross-border mergers will, no doubt, continue. And, of course, the deepening of technology capabilities in most financial institutions means that management on a global scale, particularly risk management, is now feasible as well as necessary. This dynamic presents many challenges. The most obvious is in achieving the benefits and promise of consolidation and broadening. As we have seen, mergers between banks do not all achieve the full promise originally envisioned. In some cases, post-merger integration skills are found wanting, as the challenge of providing seamless service while integrating disparate back offices and branch networks proves to be beyond the skills of management. In other cases, the dynamics of newly acquired businesses prove unpredictable and leave even experienced managers explaining revenue shortfalls and earnings disappointments. As we know, markets can be unforgiving of such surprises, and boards of directors often follow the market signal and punish the top management thought to be responsible for failure. An obvious difficulty is the inability to predict which merger will be successful, with the winners not just cost-cutters but those who know how to develop new sources of revenue. While in general the early experience in large, cross-industry consolidation appears to be successful, we have not yet had the test of a slowing economy. Until we have gone through a full business cycle, it is hard to know how strong the business case for integration truly is. Second, we must be cautious in assuming that more diversified and larger firms are inherently less risky. One of the ongoing challenges in the emerging world of high-tech finance is risk management. The experiences of the past two and a half years indicate that the speed of market movements, combined with the scale of financial endeavor, can lead to a rapid reversal of fortune for even the most sophisticated market participants. Models are inherently backward-looking, and even the best of them have not proven to be foolproof in sounding the alarm for newer risks.
There is evidence that banking organizations, and probably financial institutions more generally, will use the benefits gained from diversification to increase the risk in the individual components of their portfolios. Indeed, some activities now permissible in financial organizations, such as merchant banking, have high average returns, but those returns mask a wide variance in result, with some outcomes quite detrimental to profits and potentially to organizational vitality. In practice, the results will differ from firm to firm, but appropriate disclosure and risk-management practices will become even more important. I am heartened, I might add, by what seems to be the fact that US bank risk-management skills paid large dividends - although clearly not avoiding all losses - in the Asian financial crisis a couple of years back. A major reality of financial institutions is that their businesses are ultimately built on understanding and trust by both retail consumers and wholesale counterparties. The third, and most ephemeral, challenge is to build scale and complexity and still maintain understanding and trust while protecting proprietary information. The debate about privacy that accompanied the modernization discussion in the Congress reflects the challenges and constraints that lie ahead. On the wholesale side, the challenge is to reveal to the market enough about risk and performance to allow for full and accurate evaluation by counterparties without disclosing proprietary information. Future LTCMs will be expected by counterparties to be much less opaque. Similarly, the ongoing review of the role of publicly issued subordinated debt for large and complex organizations is another example of the expectation that such organizations will be held to a higher test of transparency in order to build counterparty understanding and trust. In addition to realism in considering the opportunities that the new law allows, bank managers must be prudent in managing and monitoring the performance of banks. There has been a recent decline in profit-growth expectations among equity analysts. The consensus view among analysts appears to be that the industry will experience earnings-per-share growth in a range of 10 to 12% over the next five years. This is respectable growth but somewhat lower than that experienced in the past five years. I hope that bank managers are keenly aware of the risk profiles of their companies and are not inclined to take additional risks to hit earnings targets. Banks are clearly trying to diversify their earnings streams. They will need to monitor carefully the performance of newer products developed and marketed during the 1990s in response to broad consumer needs. While we have enjoyed a record expansion, the prospects for a business and economic downturn must be factored into pricing decisions. Credit and underwriting decisions should take into account realistic downside sensitivity analysis. Prudence in individual investment and borrowing decisions However, financial institutions are not the only economic actors who need to maintain realistic expectations and to exercise prudence and caution during this period. Individuals must exercise ongoing vigilance in their personal financial behavior. In particular, individuals should recognize that in this era of technology-induced growth, high growth goes hand in hand with high uncertainty and, for newer companies, volatility in their financial performance.
This means that accurately valuing a company in the high-growth industries is dauntingly complex. Therefore, individual investors are best advised to consider a range of scenarios, including not just the rosy outcome of possible success but also the very real one of potential failure. History clearly demonstrates that successful start-ups are the exception: the vast majority find success elusive.
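To make that advice concrete, consider a probability-weighted valuation across scenarios. This is a minimal sketch with invented probabilities and payoffs - illustrative only, not a model of any actual company or a recommendation:

```python
# Hypothetical scenario weights and payoffs for a stake in a young company.
scenarios = {                  # (probability, value of the stake in dollars)
    "breakout success": (0.10, 500_000),
    "modest survival":  (0.30, 80_000),
    "failure":          (0.60, 0),
}
expected_value = sum(p * v for p, v in scenarios.values())
print(f"${expected_value:,.0f}")   # $74,000 -- far below the rosy-scenario figure
```

The point is not the particular numbers but the discipline: the downside scenario can carry most of the probability mass, and a valuation that ignores it will be badly biased upward.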
Individuals would also be well advised to consider a range of personal financial scenarios. Perhaps based on expectations of solid income growth, which we all hope will be borne out, households have increased their debt faster than their disposable personal income in every quarter over the past five years. Despite increased borrowing, however, the household debt service burden, as conventionally measured to include consumer and mortgage debt, remains below the levels reached in the 1980s. This burden has been held down in recent years by falling interest rates and a shift toward longer-maturity mortgage debt. Nonetheless, even in good economic times it is prudent for households to be prepared for a range of outcomes, not just the most optimistic ones.

Caution in the global economy

While being cautious, let me not convey a pessimistic tone, because I believe that the four major forces currently driving the domestic economy could well provide the underpinning for a new era of prosperity in the global economy. The first of these forces is the creation of, and massive investment in, technology - particularly information and communication technologies. Technology is thought to have played an important role in the increase in productivity - the output of goods and services per hour of labor - that is currently providing momentum for the economy of the United States.

The second major force is business deregulation. The removal of unnecessary government regulation started more than twenty years ago during the administration of President Gerald Ford but gathered momentum during the Carter years. Deregulation allowed businesses, indeed forced businesses, to focus more clearly on the competitive marketplace, with lessened constraints and increased flexibility.

The third major force is more prudent fiscal policy. The latter part of the 1990s has been characterized by government surpluses, which, many believe, have freed resources for private-sector investment.

The final major change was the reduction of both actual inflation and the expectation of inflation as a necessary component of personal and business decision-making. This trend began during the early 1980s, and it has reached the point of fruition only in the past few years. Relatively stable prices have allowed businesses and households to plan their economic affairs with a general expectation that the value of investments will not be eroded through a pernicious increase in the general price level. Indeed, price level stability has reinforced the impetus provided by deregulation for businesses to manage their affairs with a priority on efficiency.

These developments are not unique to the United States. While our nation was the first to achieve the full benefits of these forces, they have been at work globally as well. Software and capital goods embodying newer information and communications technology are a major export of the United States. Other nations have their own domestic equivalents of our Silicon Valley and Route 128, whether they are called Bangalore in India or Helsinki in Finland.

We are probably ahead in experiencing the benefits of newer technologies, but other countries will certainly catch up. The other three factors, which are preconditions to achieving the benefits of technology, are showing signs of advancing outside the United States, although the pace differs from country to country. Most industrialized economies have debated, and are continuing to debate, the question of how to free businesses from unnecessary regulation and what governments can and should do to make labor markets more flexible. This issue is clearly the focus of much attention currently in Europe. A number of emerging-market economies have privatized state-owned enterprises and generally are reducing regulation. The United Kingdom and Japan put financial deregulation into place in the 1980s and 1990s. Additionally, much of the industrialized world has governments following a path of smaller deficits and eventually smaller debt. The 1992 Maastricht Treaty, laying the groundwork for the unification of much of Europe into a single market with its own currency - the euro - is the most obvious but not the only example of this trend.

Finally, the emerging consensus among politicians, policymakers, and the general public in many nations is that any benefits of inflation are at best ephemeral and that inflation ultimately is highly destructive. The efforts being made by countries that have experienced periods of inflation, such as Brazil and Argentina, to avoid a recurrence of those experiences are instructive in this regard. Inflation has been coming down in both industrialized and emerging-market economies during the 1990s.

However, achieving sustained global growth requires certain improvements. This potential global prosperity demands sounder banking institutions in all countries, particularly those that are still heavily dependent on bank-based financial intermediation, and more stable financial systems, putting a special burden on supervisory and regulatory authorities to remain vigilant. Similarly, technology allows for a more intertwined financial system, which again requires discipline by both the private sector and the public sector to remain successful. Finally, a global economy built around higher levels of technology and greater competition in markets for goods, services, and labor input will nevertheless also include persons or regions that, by fortune or skill, are not fully prepared to participate in a world economy. Those on the outside of this highly productive economy, be they our fellow citizens or entire nations and regions, will require special consideration from the national and international authorities with responsibility for providing economic assistance.

Conclusion

In concluding, let me reiterate that the prosperity now experienced by the United States, and potentially to be shared by the rest of the world, is certainly a welcome development. It is clearly the goal of the Federal Reserve to follow policies that will help extend this prosperity for as long as possible. However, it is also important for the financial sector and other members of the private sector here in the United States, and for market participants, banks, and regulators in other countries, to remain vigilant if this expansion is to continue here and to spread globally. We are in the midst of a period of enormous opportunity, one that can be extended and strengthened if we are realistic, remaining mindful of our obligations to act responsibly, both individually and collectively.
In this context, for managers in the financial sector, acting responsibly includes recognizing that not all financial institutions can successfully consolidate or profitably take advantage of every new power. For individuals, personal financial decisions - both investment and borrowing - should take into consideration the possibility that the most optimistic expectations of corporate or personal financial success might not come true. Finally, nations seeking to replicate the growth experience of the recent past in the United States should recognize that the current expansion is built, in part, on an underpinning of a sound financial system, including both healthy institutions and well-functioning capital markets, as well as solid supervision and regulation. I hope that we all recognize these lessons so that our age of opportunity reaches its full potential.
Edward W Kelley, Jr: Learning from experience - Y2K revisited

Speech by Mr Edward W Kelley, Jr, Member of the Board of Governors of the US Federal Reserve System, at the Cosmos Club, Washington, D.C., on 30 March 2000.

* * *

Thank you very much for the high honor of addressing the Cosmos Club, particularly for the opportunity to address the topic of Y2K for a second time. I have no doubt it is highly unusual, if not unique, for a speaker to rise here to discuss a non-event. To our pleasant surprise, that is exactly what Y2K turned out to be. When I agreed, many months before the millennium rollover, to offer these remarks, I fully expected to be making a very different speech. On my last visit here in January 1999, my theme was that I believed our extensive preparations would allow us to avoid serious difficulty but that we could expect a series of hopefully minor inconveniences. I held that view throughout last year, and with that expectation, I thought that tonight I would be expounding upon how we managed to avoid any worldwide breakdowns, how we contained those threats that did arise, why other nations had more trouble than we did, and the lessons that could be gleaned from this mixed bag of results. Instead, of course, the danger was almost totally avoided, and consequently the questions for today are very different.

Let me try to address the following issues. First, briefly, was there really such a big threat in the first place? Second, far more fundamentally, why did we do so well? The answers to this question are wide-ranging, and it is out of these answers that the implications for the future arise. And, finally, what are some of the short-term possibilities and long-term concerns that this experience suggests?

First of all, one must begin a Y2K retrospective by dealing with the question of whether this really was a serious threat or simply vastly over-hyped. The short answer is, you bet it was a serious threat. No reputable computer expert that I know of disagrees with that, and I can verify this assessment from a very up-close and personal perspective. My window onto the rollover event was through the Communication and Control Center we set up at the Fed to monitor the situation in the financial sector. Our preparations were quite comprehensive. We were prepared to receive, and did receive, real-time news of any problems, however minor, that arose in Federal Reserve operations or payment systems across the country or in any depository institutions. There are about 22,000 depository institutions in the United States, and not one got into serious trouble as a result of the rollover. The financial community can be very proud of that.

To be sure, a few organizations overlooked fixes that were readily available, and a few found mistakes in their remediation work. But because the problems were so few, and also so minor, all were fixed before they could become meaningful disruptions. But the most interesting thing was to observe, even in these few minor occurrences, how much mischief could have been done had the problems not been caught quickly and dealt with effectively. And, further, to imagine the chaos if there had been many thousands of them. An example. We at the Fed examined approximately 90 million lines of our own code contained in thousands of programs and had to remediate approximately 10% of them.
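For readers who did not live through the remediation effort, the flaw being hunted was usually nothing more exotic than a two-digit year field. The sketch below is purely illustrative - hypothetical function names, and Python rather than the COBOL in which much of the affected code was written - but it shows the shape of the bug and of the common "windowing" fix, which reinterprets two-digit years against a pivot instead of hard-wiring them to the 1900s:

```python
from datetime import date

PIVOT = 50  # hypothetical choice: two-digit years below 50 mean 20xx, else 19xx

def expand_year(yy: int) -> int:
    """The 'windowing' fix: map a two-digit year to a four-digit one."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

def days_between_flawed(yy1, md1, yy2, md2) -> int:
    # The classic Y2K flaw: every two-digit year is assumed to be 19xx.
    return (date(1900 + yy2, *md2) - date(1900 + yy1, *md1)).days

def days_between_fixed(yy1, md1, yy2, md2) -> int:
    return (date(expand_year(yy2), *md2) - date(expand_year(yy1), *md1)).days

# An accrual period running from 30 December 1999 to 3 January 2000:
print(days_between_flawed(99, (12, 30), 0, (1, 3)))  # -36520: a century backward
print(days_between_fixed(99, (12, 30), 0, (1, 3)))   # 4: the correct answer
```

Multiply an error like that across interest accruals, maturity sorts, and settlement queues, and the scale of the examination effort becomes easier to appreciate.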
We had one system in one of our 12 Federal Reserve Districts that had a Y2K glitch show up in its immediate post-millennium operations. It was fixed in two hours but had already misallocated millions of dollars to the wrong banks. The errors were quickly identified and reversed, and nobody was hurt or even inconvenienced, but the potential for mischief was huge in just that one incident. I am sure one can find war stories like that in many, many places. What if there had been many thousands of such instances, and malfunctions in the endless interconnections between systems had metastasized the errors faster than the technicians could address them? It would have been a nightmare. If preparing for Y2K had been ignored, that is exactly what would have happened. But it was not ignored, and the massive effort paid off in the event.

No, there were not serious problems, either in the United States or around the world. Which leaves the most interesting question - why not? The answers, as best we can understand them so far, are quite instructive, and they vary a great deal from country to country. The extent of computerization among countries falls along a spectrum. Clearly alone at one end of the spectrum is the United States, which is by far the most electronically automated of societies. One moves along the spectrum to the other industrialized nations, then to those many countries progressing in varying degrees toward modernization, and finally at the other end is the seriously underdeveloped world. All came through well, but for very different reasons.

In the United States, we turned in one of our finest national performances in rising to meet a crisis. In retrospect, we can now see that Y2K was made to order for a classic American "can do" response. It was a specific, clearly identifiable and definable threat with an immutable, moment-in-time deadline. It was massive in scope, required the efforts of millions of people and the expenditure of billions of dollars, but it could be, and was, dealt with. The challenges came in phases. First, it was necessary to achieve public understanding of what was at stake. Next, to achieve the committed involvement not only of computer technicians (that was easy) but also of organizational leadership throughout our economy. And finally, after a frighteningly slow start, there was a nearly frantic rush to complete all the work necessary before the deadline arrived. Those of us who had been deeply involved for a long time simply had no way of knowing on 31 December 1999 to what extent our country, let alone the rest of the world, was really ready. We now know that the American people had responded once again, as the kids say, "big time".

In moving along the computerization spectrum to identify how others avoided major problems, we first need to acknowledge that in a few nations there were other examples of massive American-style efforts, particularly in the last six to nine months of 1999. But, basically, the answers are to be found elsewhere. First of all, both here and around the world there were numerous instances of large and small organizations simply jettisoning old systems entirely and replacing them with new, highly efficient systems that were designed to be Y2K compliant from conception. This was just an acceleration of the replacement cycle, and, while expensive in the short run, it is now paying off in improved productivity. Second, for over two decades, much of the automation investment dollar in the United States went to purchasing brand-new capabilities, leaving older so-called "legacy" systems in operation.
A very high percentage of them utilized dates in some way, and those were virtually all Y2K flawed. Further, as they were being painstakingly reviewed for Y2K, it was realized they were very often slow, ineffective and poorly documented. It was an important and distressing revelation to American management to discover itself so dependent on inefficient and obsolescent equipment, and many are now moving rapidly to rectify this condition.

Other nations, which began to automate more recently, never amassed the vast inventory of these older systems that required so much remedial work in our country. The newer systems, which many put in place over the last few years, were more standard and better documented, thus easier to check out and fix. In many cases, after a few operations checked out such a system and discovered either that it was all right or that it could be easily fixed, and how to do it, the word got around, and many others with the same installation were saved a lot of trouble.

Then, in other places, only very new off-the-shelf software was being used. These were generally first efforts at automation in emerging areas. Most software vendors developed simple fixes for their products and put them up on their web sites for anyone to employ. You may have seen the report from Jamaica, where a very small group of knowledgeable technicians apparently fixed virtually every public and large private computer system in the entire nation in just a few months by simply following instructions available on the internet. It was, of course, overwhelmingly US companies that provided that free know-how. Finally, there are, shockingly, many nations that had little to do because there simply are very few computers there.

As we sat in the Fed control room on 31 December and watched time zone after time zone roll through the millennium problem-free, we got increasingly nervous. We had better not have a problem here after all the very visible work and all the international preaching we had been doing for so long! That would be a major embarrassment! But all went well.

What implications may this experience have for us? We are still much too close to the event to have adequately identified all that happened, let alone comprehensively assessed its impact on the future. But let me offer a few speculations. First, in the short term. Many foreigners are discreetly, but audibly, crowing about how we in the United States were so obsessed and spent so much money on Y2K while they had equally good results and were much more relaxed about it. I am convinced that we will have the last laugh. As a result of that so-called obsession, I believe we have strengthened the underpinnings of our already impressive level of productivity improvement.

Among larger companies and public institutions, there is a new awareness in senior management of the possibilities for improving their operations through more effective technology and systems management. An example would be those old, inefficient legacy systems we just had to rework. I would not expect to see them proliferate again, once they disappear. Among the millions of smaller businesses and not-for-profit organizations, there is a new awareness and appreciation for what computers can do and a new sense of confidence in their ability to capture these improvements. This new sensitivity should lead to a far more rapid assimilation of the information technology revolution than has been the case with similar technical breakthroughs in the past.
For example, the development of steam power, and later the wide availability of electrical power, took many decades to work through into comprehensive utilization by the economies of their day. Computers, of course, were invented a half-century ago, but it is only quite recently that they have evolved into broadly useful tools. It is even more recently that they have begun to have a major impact on the everyday operation of our economy. At the end of the 1950s, there were about 2,000 computers up and running in the world. Forty years later, there are approximately 200 million, and they are, of course, vastly more powerful and far less expensive. This should now expand much more quickly as a result of the rapid and wide exposure information technology achieved through the run-up to the Y2K event.
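Those two figures imply a remarkable compound growth rate; a quick back-of-the-envelope check:

```python
# From roughly 2,000 machines at the end of the 1950s to roughly 200 million
# forty years later: a 100,000-fold increase in the installed base.
growth = (200_000_000 / 2_000) ** (1 / 40) - 1
print(f"{growth:.1%}")  # about 33% per year, compounded, for four decades
```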
There has been a great deal of concern in our society over the past several years about an emerging "digital divide" between the more and the less advantaged elements of our society. Computer literacy, some fear, will further widen the already broad gulf between the earning power of rich and poor. I would hope that Y2K's wide public awareness and involvement could provide us an opportunity to shrink that gap rather than see it open even more. Certainly, everyone now knows there is a strong demand for computer skills in the workplace today and that it will only enlarge. Therein lies a huge employment opportunity, offering good wages and working conditions, for qualified applicants. To be sure, as is the case in so many other opportunities for societal improvement, an improvement in the education system will be required for us to seize this chance to move a step closer to greater income parity. But computers are now very much on the minds of leaders in education reform.

On the other hand, Y2K, by exposing the vast differences in computer sophistication among nations, demonstrates that we have yet another "digital divide" concern to contend with: the lack of information technology infrastructure in many countries will be one more obstacle to their ability to improve their competitiveness with the advanced nations.

Finally, some long-term concerns. Certainly, there are endless benefits latent in the new technologies, but they have been widely touted, and reciting them is beyond the scope of these remarks. But let me mention a few emerging concerns that may need to be dealt with as time goes on. Earlier we mentioned that the nature of the Y2K challenge, with its very specific dimensions and very explicit deadline, was made to order for the American workforce to deal with. We passed that first major test of this new era with flying colors, but history may show that, huge and critical as it was, it was relatively easily handled compared with an emerging new genre of social challenge. Later technology-driven challenges may be just as profound but less obvious, less deadline-driven, less "salable", more amorphous, and more difficult to pin down. They may be less overtly technical and may play out more on national and international political and social policy stages rather than at the level of the firm. Problems of that sort can be exceedingly difficult to come to grips with. If care is not taken, many could fester unaddressed, with results ranging from suboptimal solutions arrived at ad hoc to out-and-out disasters of any of a number of configurations.

Public infrastructures and cultural norms, here as well as around the world, must evolve to successfully accommodate a new marketplace involving many entirely new elements in law and social structure. Driven by technology, one can identify the outlines of future problems, such as various ecological issues that involve many nations and require difficult levels of cooperation to address successfully. Personal and national security threats could be unleashed by high-school hackers, computer-competent criminals, or even massive computer assaults mounted by terrorists or rogue states. The prerogatives of sovereign nations will be challenged in such areas as cyber patent opportunities, taxation of cyber commerce, and intellectual property rights. Personal privacy rights are already on the front burner and could prove to be quite intractable as high-tech capabilities advance. Modern financial systems are becoming so technologically sophisticated that regulators around the world are struggling to ensure their safety and soundness. There are difficult ethical, legal and religious issues surrounding technologies unlocking the secrets of the human genome. One could go on and on.

We may look back on Y2K as a major watershed event of a new information technology era that first dawned only two or three decades ago. We have learned much from it and glimpsed in the mirror darkly how much more we must yet learn to cope with. We have vividly seen how complex and interdependent our economic affairs have become, and this new awareness is already beginning to pay off in higher levels of efficiency and effectiveness. But we have also seen the outlines of technology-driven challenges that will press in on us with increasing urgency and could prove to be very difficult to deal with effectively. The Cosmos Club, both through its distinguished membership and the program focus it plans for the future, will play an important role in the drama that will unfold. It has been a privilege for me to have a small part in the opening scene.
Alan Greenspan: Change in the US retail payment systems

Speech by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the National Automated Clearinghouse Association Annual Meeting, held in Los Angeles, on 10 April 2000.

* * *

It is a pleasure to be with you this morning to discuss the changes that are taking place in our retail payments system. Many of the individuals and institutions involved in these changes will be addressing this conference over the next three days. It seems clear that, as in many sectors of the economy, innovations in technology, changes in business practices, and effective competition are reinforcing one another and causing the pace of experimentation with new products and services to accelerate.

Nonetheless, the payment systems of the United States present a paradox. Our systems and banking arrangements for handling large-value dollar payments are all electronic and have been for many years. Banking records, including those for loans and deposits, have been computerized since the 1960s. Securities markets also now rely on highly automated records and systems, born out of necessity following the paperwork crisis of the 1970s. Yet in transactions initiated by consumers, paper - currency and checks - remains the payment system of choice.

There were sweeping predictions in the late 1960s and early 1970s that electronic payments would quickly replace paper in the nation's commerce. In the wholesale financial markets, these predictions came true, as concerns about risk and efficiency led to the widespread adoption of electronic technologies in the back offices of financial firms and in payment and settlement systems. Yet in the retail payments system, we have tended to underestimate the size of the hurdles confronting a shift away from paper.

Indeed, the average consumer is exceptionally conservative and traditional when it comes to money, which has a profoundly important role in day-to-day living. To the vast majority of people, it represents the stored value of one's previous efforts. To many, it is the embodiment of their life's work. Tampering with money has always had profoundly political implications. Much of American politics of the late nineteenth century, for example, was about the gold standard and the free silver movement. William Jennings Bryan's famous "Cross of Gold" speech during the presidential campaign of 1896 reflected the deep-seated views of money's role in society and, even today, one can hear echoes of that debate in the public discourse about money. Our history vividly affirms that the average person is far more sensitive to what form our money - our store of value and medium of exchange - takes than we payments system specialists have readily understood. It took many generations for people to feel comfortable accepting paper in lieu of gold or silver. It is taking almost as long to convince them that holding money and making payments in ephemeral electronic form is as secure as using paper.

There is, of course, more to the tenacity of paper than a deep psychological connection between money and tangible wealth. Paper instruments also are perceived to have a greater degree of privacy than electronic payments, although there have been experiments with electronic money and other instruments that would provide relatively high levels of privacy. But confidence in such arrangements may take quite a while to emerge.
Currency, and to a large degree checks, are currently perceived to offer significant advantages in privacy over electronic payment systems that entail centrally maintained databases with elaborate records of individual transactions.

Perhaps an even more important dimension influencing our behavior regarding money and payments is convenience. Currency and checks do not require the users to travel to special locations, dial the number of a special machine, or maintain special equipment to originate payments. This is not to deny that automation has played an important role in reducing risks and increasing efficiency in handling currency and checks. Rather, the issue is that traditional paper instruments allow the users themselves, within a structured format, to have significant control over when, where, and how to make payments.

Turning to the suppliers of payment instruments and services, we see that many are straddling two different worlds. The world of paper is well known and a major part of the business of traditional financial institutions. The world of electronic commerce is a new and growing part of business that is changing daily and operating on a different time scale. The phrase "Internet time" has now been added to our vocabulary. Behind this phrase is a serious observation that advances in information technology allow new ideas to be transformed into products and services much more rapidly than a few years ago, thus greatly speeding up product cycles. At the same time, new information technologies have broken down barriers between firms and stimulated very creative and competitive processes across the economy.

Some traditional financial institutions have tended to view this process with concern. As many firms have driven to find new ways to supply financial and other kinds of information, along with transactions and accounting services, some have expressed concern that their traditional payment franchise is being eroded. This concern is another manifestation of the insecurity brought on by innovation and change.

Many firms, including financial firms, have now opened channels of data communication with existing and potential customers and business partners through the Internet. In this world, particularly in retail commerce, payments by paper have been the exception, not the rule. Despite ongoing discussions about privacy and security in electronic commerce, credit cards have rapidly become the payment instrument of choice for consumers. Interestingly, there have been experiments with new payment systems analogous to private currency. To date, these products have not been widely successful, despite the fact that some have offered significant degrees of privacy and security. Instead, familiarity with and confidence in the credit card built up over more than half a century of use seem for now to have shaped behavior. Some suppliers have sought to deepen confidence by voluntarily expanding consumer protections. In a twist of history, even gold coins can now be purchased on line with a credit card.

Experiments are also taking place to facilitate the use of debit cards in on-line transactions. The use of such instruments would clearly expand electronic payment capabilities over the Internet to those with bank accounts who do not hold credit cards. Experiments with technologies such as electronic money that do not even require bank accounts may yet find a role to play.
New arrangements are also being tried that would mimic the flexibility of the check in making payments in diverse on-line transactions ranging from ad hoc person-to-person payments to routine business-to-business purchases.

Regarding the older electronic payment systems such as the automated clearinghouse (ACH), both suppliers of payment services and end users are continuing to look for new ways to build on the interbank processing efficiencies that these systems offer. One of the great ironies is that studies in the 1960s and 1970s led to recommendations that it would be more economical for society to build whole new electronic payment systems such as the ACH than to adopt check-truncation technologies. Although the ACH has been extremely effective for automating some types of transactions, it has not been as widely used as originally anticipated. One of the problems has apparently been the relative lack of flexible and low-cost interfaces with consumers and with business systems similar to those that have been built up around the check.

Now, however, a range of experiments and businesses are building on the ACH, and potentially on other electronic payment networks. In a revival of the idea of check truncation, projects have gone forward to truncate checks at the point of sale, as well as at lockbox locations, and to substitute ACH payments. These projects seek to combine the benefits to users of the check with the processing efficiencies of electronic payment systems. One more set of very interesting experiments involves electronic bill presentment and bill payment. There are competing models of the way technology can be used to eliminate paper and save time in both the presentment and payment of consumer bills. Leading models draw heavily on the ACH as the electronic payment mechanism, creating a much more flexible interface for users with the ACH than has existed in the past.
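In outline, point-of-sale truncation is a simple capture-and-substitute step. The sketch below is a stylized illustration only - the types and field names are hypothetical, and it makes no attempt to reproduce actual ACH record formats:

```python
from dataclasses import dataclass

@dataclass
class PaperCheck:
    routing_number: str   # read from the MICR line at the register
    account_number: str
    check_number: str
    amount_cents: int

@dataclass
class ACHDebit:
    routing_number: str
    account_number: str
    amount_cents: int
    statement_text: str   # what the payer later sees on the bank statement

def truncate_at_point_of_sale(check: PaperCheck, merchant: str) -> ACHDebit:
    """Capture the check's MICR data, hand the paper back to the customer,
    and originate an electronic ACH debit in its place."""
    return ACHDebit(check.routing_number, check.account_number,
                    check.amount_cents,
                    f"CHECK {check.check_number} {merchant}")

entry = truncate_at_point_of_sale(
    PaperCheck("123456780", "000987654", "1047", 4250), "CORNER GROCERY")
print(entry.statement_text)   # CHECK 1047 CORNER GROCERY
```

The paper never enters the collection stream; only the data does - which is precisely the combination of check-like convenience and interbank processing efficiency described here.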
As we look forward, the Federal Reserve recognizes that, whatever innovations develop, the check will likely be with us for many years. Americans still write about sixty-eight billion checks a year, and the numbers are expected to grow. At the Federal Reserve, we continue to modernize our check-processing systems. We are testing new systems for truncating and electronically presenting checks, which include capturing and storing the image of checks and enabling institutions to make payment decisions in real time by accessing these images through the Internet. At the same time, we are working to strengthen the payments system by enhancing the long-term efficiency of our check and automated clearinghouse services.

The Federal Reserve also clearly recognizes the need to foster innovation in the private sector and to help remove barriers to the development and adoption of new payment services for electronic and traditional commerce. As I have often said, to continue to be effective, government's regulatory role must increasingly be focused on assuring that adequate risk management systems are in place in the private sector. As financial systems have become more complex, detailed rules and standards have become both more burdensome and less effective. If we wish to foster financial innovation, we must first be careful not to impose rules that inhibit it, and we must be especially watchful that we not unduly impede our increasingly broad electronic payments system.

Thus, the private sector needs to play the pivotal role in determining what payment services consumers and businesses actually demand and in supplying those services. In a period of change and uncertainty there may be a temptation, and a desire by some market participants, to have the government step in and resolve the uncertainty, whether through standards, regulation, or other policies. In the case of electronic payment innovations, only consumers and merchants will ultimately determine what new products are successful in the marketplace. Government action can retard progress, but almost certainly cannot ensure it. One important role government can play, however, is to help identify and, where appropriate, help remove barriers to innovation.

As part of our continuing efforts, the Federal Reserve established last summer the Payments System Development Committee. The Committee, led by the Board's Vice Chairman Roger Ferguson and President Cathy Minehan of the Boston Federal Reserve Bank, will advise us on public policy issues relating to the strategic development of the retail payments system. An important objective of the Committee is to work with the private sector to identify specific barriers to improving the retail payments system, along with steps that the Federal Reserve could take to address these barriers.

As you begin this three-day conference focusing on new developments in the payments system, I hope that you will approach your discussions with a sense both of history and of new opportunities. Centuries of experience have been distilled into our traditional forms of paper payments, and change has not always come quickly. Yet new technologies and new forms of business are engines for change. More fundamentally, the enthusiasm of our society for experiment and innovation reflects a strong sense of confidence about the future that began in the very early days of our country. I am confident that this past will be prologue.
Alan Greenspan: Technology and financial services

Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the Journal of Financial Services Research and the American Enterprise Institute Conference, in Honor of Anna Schwartz, Washington, on 14 April 2000.

* * *

I am pleased to participate in today's conference and to join others in the justly deserved tributes to Anna Schwartz. She has been most interested, among a wide variety of other subjects, in the connections that economic and financial policies have to financial crises. As a consequence, I have decided to speak about technology and financial services, and particularly about risk management, issues that I have spent a good deal of time addressing in recent years. As a related matter, I will comment on supervision and regulation as we move into the 21st century, and of course, I shall find a way to touch on the role of central banks.

Without doubt, the acceleration in technology that has produced such an extraordinary effect upon our economy in general has had a particularly profound impact in expanding the scope and utility of financial products over the last fifteen years. Information technology has made possible the creation, valuation, and exchange of complex financial products on a global basis heretofore envisioned only in our textbooks, and even that just in recent years. Derivatives are obviously the most evident of the many products that technology has inspired. But the substantial increase in our calculation capabilities has permitted a variety of other products - and, most beneficially, new ways to unbundle risk - to be put in place. What is really quite extraordinary is that there is no sign that this process of acceleration in financial technology is approaching an end. We are moving at an exceptionally rapid pace, fueled not only by the enhanced mathematical applications produced by our ever-rising computing capabilities but also by our expanding telecommunications capabilities and the associated substantial broadening of our markets.

All the new financial products that have been created in recent years contribute economic value by unbundling risks and reallocating them in a highly calibrated manner. The rising share of finance in the business output of the United States and other countries is a measure of the economic value added by the ability of these new instruments and techniques to enhance the process of wealth creation. The reason, of course, is that information is critical to the evaluation of risk. The less that is known about the current state of a market or a venture, the less the ability to project future outcomes and, hence, the more those potential outcomes will be discounted.

Financial intermediation, although it cannot alter the underlying risk in holding direct claims on real assets, can redistribute risks in a manner that alters behavior. This redistribution of risk induces more investment in real assets, presumably engendering a higher standard of living. This occurs because financial intermediation facilitates diversification of risk and its redistribution among people with different attitudes toward risk. Any mechanism that shifts risk from those who choose to withdraw from it to those more willing to take it on increases investment without significantly raising the perceived degree of discomfort from risk borne by the public.
By itself, more abundant real-time information should both reduce the uncertainties and lower the variances employed to guide portfolio decisions. At least part of the observed fall in equity premiums in our economy and others over the past five years may have resulted from a permanent technology-driven increase in information availability, which by definition reduces uncertainty and therefore risk premiums. And because knowledge once gained is irreversible, so too are the lowered risk premiums.

But while financial intermediation, through its impetus to diversification, can lower the risks of holding claims on real assets, it cannot alter the more deep-seated uncertainties inherent in the human evaluation process. There is little in our historical annals that suggests that human nature has changed much over the generations. But, as I have noted previously, while time preference may appear to be relatively stable over history, perceptions of risk and uncertainty, which couple with time preference to create discount factors, obviously vary widely, as does liquidity preference, itself a function of uncertainty. These uncertainties are an underlying source of risk that is too often regarded as background noise and is generally not captured in our risk models.

I have previously called attention to changing risk perceptions as a risk-management challenge in a different context when discussing the roots of the recent international financial crises. My focus has been on the perils of risk management when periodic crises - characterized by sharply rising risk premiums - undermine risk-management structures that fail to address them. During a financial crisis, risk aversion rises dramatically, and deliberate trading strategies are replaced by rising fear-induced disengagement from market activity. It is the general human experience that when confronted with uncertainty, whether in financial markets or in any other aspect of life, disengagement is the normal protective reaction. In markets that are net long, the most general case, disengagement brings falling prices. In the more extreme manifestation, the inability or unwillingness to differentiate among degrees of risk drives trading strategies to seek ever-more-liquid instruments that presumably would permit investors immediately to reverse decisions at minimum cost should that be required. As a consequence, even among riskless assets, such as US Treasury securities, liquidity premiums rise sharply as investors seek the heavily traded "on-the-run" issues - a behavior that was so evident in the fall of 1998.

While we can readily describe the process of sharp reversals in confidence, to date economists have been unable to anticipate it. Nevertheless, if episodic recurrences of ruptured confidence are integral to the way our economy and our financial markets work now and in the future, the implications for risk measurement and risk management are significant. Probability distributions estimated largely, or exclusively, over cycles that do not include periods of panic will underestimate the likelihood of extreme price movements because they fail to capture a secondary peak associated with extreme negative outcomes. Furthermore, joint distributions estimated over periods that do not include panics will underestimate correlations between asset returns during panics.
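A small simulation makes the point concrete. This is a deliberately stylized sketch - a two-regime mixture with invented parameters, not anyone's production risk model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 250_000

# Two regimes for a pair of asset returns: 95% "calm" days with modest
# correlation, 5% "panic" days with negative drift, triple the volatility,
# and much higher co-movement.
panic = rng.random(n) < 0.05
calm = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.2], [0.2, 1.0]], size=n)
crisis = rng.multivariate_normal([-1.0, -1.0], [[9.0, 7.2], [7.2, 9.0]], size=n)
returns = np.where(panic[:, None], crisis, calm)

# Fit the tail to calm days only -- the "cycles that do not include panics".
calm_only = returns[~panic]
cutoff = np.percentile(calm_only[:, 0], 0.1)   # a 1-in-1,000 calm-day loss

print((calm_only[:, 0] < cutoff).mean())       # ~0.001, by construction
print((returns[:, 0] < cutoff).mean())         # roughly ten times larger
print(np.corrcoef(calm_only.T)[0, 1])          # ~0.2 in calm periods
print(np.corrcoef(returns[panic].T)[0, 1])     # ~0.8 during panics
```

The calm-sample model understates both the tail probability and the panic-period correlation - exactly the two failures described above, and the reason diversification looks better on paper than it performs in a crisis.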
Under these circumstances, fear and hence disengagement on the part of investors holding net long positions often lead to simultaneous declines in the values of private obligations, as investors no longer materially differentiate among degrees of risk and liquidity, and to increases in the values of riskless government securities. Consequently, the benefits of portfolio diversification will tend to be overestimated when the rare panic periods are not taken into account.

The uncertainties inherent in valuations of assets and the potential for abrupt changes in perceptions of those uncertainties clearly must be adjudged by risk managers at banks and other financial intermediaries. At a minimum, risk managers need to stress test the assumptions underlying their models and consider portfolio dynamics under a variety of alternative scenarios. The outcome of this process may well be the recommendation to set aside somewhat higher contingency resources - reserves or capital - to cover the losses that will inevitably emerge from time to time when investors suffer a loss of confidence. These reserves will appear almost all the time to be a suboptimal use of capital, but so do fire insurance premiums - until there is a fire.

More important, boards of directors, senior managers, and supervisory authorities of financial institutions need to balance emphasis on risk models that essentially have only dimly perceived sampling characteristics with emphasis on the skills, experience, and judgment of the people who have to apply those models. Being able to judge which structural model best describes the forces driving asset pricing in any particular period is itself priceless. To paraphrase my former colleague, Jerry Corrigan, the advent of sophisticated risk models has not made people with gray hair, or none, wholly obsolete.

More fundamentally, technology may be affecting the underlying economics of financial intermediation. One of the profound effects of technology on financial services is that the increasing availability of accurate and relevant real-time information, by reducing uncertainty, reduces the cost of capital. That is to say, the cost of capital is lower for both lenders and borrowers and for banks in their role as both. It is important to a bank as a borrower because funding costs are critically tied to the perceived level of uncertainty surrounding the institution's condition. It is important in the role of lender because a decline in uncertainty resulting from a substantial increase in real-time information implies a reduction in what might be called "knowledge float" - the ability to maintain proprietary information and earn a rate of return from that information at no cost.

As you know, financial intermediaries historically have been successful not only because they diversified to manage risk but also because they possessed information that others did not have. This asymmetry of information was capitalized at a fairly significant rate. But that advantage now is rapidly dissipating. We are going to real-time systems, not only with transactions but with knowledge as well. Financial institutions can respond to this disappearing advantage by endeavoring to preserve the old way of doing business - by keeping information, especially adverse information, away from the funders of their liabilities. But that, I submit, is a foolish policy that buys a dubious short-term gain at a substantial long-term cost. Moreover, inevitably and increasingly, it will become more difficult to do.
Whenever it becomes clear that the information coming out of an institution is somehow questionable, that institution will pay an uncertainty premium. Conversely, when companies write off errors, their stock prices almost invariably rise. The reason is the removal of uncertainty and the elimination of a shadow on the company's credibility.

What does all this mean for financial supervision and regulation? If the supervisory system is to remain effective in fostering the safety and soundness of the country's financial system, it must adjust to the changing structure of that system. When wearing our supervisory hat at the Federal Reserve, we and our sister agencies are always working to move in a manner that facilitates and fosters innovation. We are in a dynamic system that requires not just us but our colleagues around the world to adjust as well. Today's financial products and rapidly changing structures of finance mean that the old-fashioned, 19th and 20th century presumption that a month-old balance sheet tells us all we need to know about an institution's current condition is long since gone. Inevitably, therefore, we as supervisors are recognizing this reality and have been placing greater emphasis on how well internal risk models are functioning and whether the risk thus measured is being appropriately managed and offset with reasonable hedges. We are also scrutinizing how well an institution is able to tie its risk exposures to internal capital needs. We have a long way to go, but this is where competitive pressures and the underlying economic forces are pushing both financial intermediaries and the supervisory system.

There is a broader and more difficult problem of risk management that central bankers confront every day, whether we explicitly acknowledge it or not: how much of the underlying risk in a financial system should be shouldered by banks and other financial institutions. Clearly, were we to require that bank risk-management systems, for example, provide capital to address all conceivable risks that could bring failure, the rates of return on capital would fall, and the degree of financial intermediation and leverage, as a consequence, would inevitably decline. The degree of leverage in financial systems is obviously tied to the degree of risk at the margin of lending. Before the creation of the Federal Reserve and, later, deposit insurance, banks were forced by the marketplace to hold 20 percent and more of their assets as capital if they wanted to sell their liabilities at minimum interest costs. By its actions in the marketplace and its chosen governmental structure, society reveals its preference for trading off leverage, with its underlying risks, against economic growth. Few, I presume, would argue that zero leverage is optimum. Fewer would argue that zero leverage is consistent with maximum growth. Yet the dangers of too much leverage are all too evident.

In this context, how do we central bankers and other supervisors read our very amorphous directive to maintain financial stability and economic growth? We have all chosen, implicitly if not in a more overt fashion, to set our capital and other reserve standards for banks to guard against outcomes that exclude those once-or-twice-in-a-century crises that threaten the stability of our domestic and international financial systems. I do not believe any central bank explicitly makes this calculation.
But we have chosen capital standards that cannot, by any stretch of the imagination, protect against all potential adverse loss outcomes. Implicit in this exercise is the admission that, in certain episodes, problems at commercial banks and other financial institutions, when their risk-management systems prove inadequate, will be handled by central banks. At the same time, society on the whole should require that we set this bar very high. Hundred-year floods come only once every hundred years. Financial institutions should expect to look to the central bank only in extremely rare situations. I am obviously referring to far more adverse outcomes than I was alluding to in my earlier remarks on the need for private risk-management systems to adjust for crises in their estimates of risk distributions. However, where that dividing line rests is an issue that has not yet been addressed by the international banking community. Clearly, to choose the distribution of risk-bearing between private finance and government is to choose the degree of moral hazard. I believe we recognize and accept it. Indeed, making that choice may be the essence of central banking.

In summary, then, although information technology by its very nature has lowered risk, it has also engendered a far more complex international financial system that will doubtless bedevil central bankers and other financial regulators for decades to come. I am sure that nostalgia for the relative automaticity of the gold standard will rise among those of us engaged to replace it.
Alan Greenspan: Banking evolution

Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, at the 36th Annual Conference on Bank Structure and Competition of the Federal Reserve Bank of Chicago, Chicago, on 4 May 2000.

* * *

The final decades of the 20th century witnessed remarkable advances in financial engineering, financial innovation and deregulation. As recently as 35 years ago, the universe of financial instruments was composed almost exclusively of deposits, short- and long-term plain vanilla debt, and equities. Financial institutions, by and large, specialized in relatively narrow portions of these markets. In the intervening years, significant developments in technology and in the pricing of assets have enabled innovations in financial instruments that allow risks to be separated and reallocated to the parties most willing and able to bear them, and the degree of specialization by financial intermediaries has changed dramatically. In the case of debt instruments, investors may now choose among structured notes, syndicated loans, coupon STRIPs and bonds secured by pools of other debt instruments. But of all the changes we have observed in the past three decades, two of the most dramatic have been the growing use of financial derivatives and the increasing presence of banks in private equity markets. Today I should like to evaluate the scope of these latter progressions, the risks they entail and some of the challenges in managing those risks.

It seems undeniable that in recent years the rate of financial innovation has quickened. Many in fact argue that the pace of innovation will increase yet further in the next few years as financial markets increasingly intertwine and facilitate the integration of the new technologies into the world economy. As we stand at the dawn of the 21st century, the possible configurations of products and services offered by financial institutions appear limitless. There can be little doubt that these evolving changes in the financial landscape are providing net benefits for the large majority of the American people. The rising share of financial services in the nation's income in recent years is a measure of the contribution of the newer financial innovations to America's accelerated economic growth. Derivatives and private equities have been in the forefront of the recent financial expansion, fostering the financing of a wider range of activities more efficiently and with improved management and control of the associated risks.

Fear of change

Nonetheless, some find these developments worrisome or even deeply troubling. The rapid growth and increasing importance of derivative instruments in the risk profile of many large banks has been a particular concern. Yet large losses on over-the-counter derivatives have been few. Derivatives possibly intensified the losses in underlying markets in the liquidity crisis during the third quarter of 1998, but they were scarcely the major players. Credit losses on derivatives spiked but remained well below those experienced on banks' loan portfolios in that episode. Derivatives credit exposures, as you all know, are quite small relative to credit exposures in traditional assets, such as banks' loans. In the fourth quarter of last year, for example, banks charged off $141 million of credit losses from derivatives - including options, swaps, futures and forwards - or only 0.04% of their total credit exposure from derivatives.
This in part reflects the fact that in some derivative contracts, most notably in interest rate swaps, there is no principal to be exchanged and thus no principal at risk. In comparison, net charge-offs relative to loans were 0.58% in that quarter - also small but, nonetheless, almost 15 times as much. In the third quarter of 1998, at the height of the recent financial turmoil, the loan charge-off rate at US banks was four and a half times that of derivatives.

In a similar vein, concerns about highly leveraged positions caused by derivatives have led to fears of "excessive leverage". But leverage, at least as traditionally measured, is not a particularly useful concept for gauging risk from derivatives. A firm might acquire an interest rate cap, for example, to hedge future interest rate uncertainty and hence to reduce its risk profile. Yet if the cap is financed through debt, measured leverage increases. Thus, although one may harbor concerns about the overall capital adequacy of banks and other participants in derivatives markets and their degree of leverage, the advent of derivatives appears to make measures of leverage more difficult to interpret but not necessarily more risky.

To be sure, the unfamiliar complexity of some new financial instruments and new activities, or the extent to which they facilitate other kinds of risk-taking, cannot be readily dismissed even by those of us who view the remarkable expansion of finance in recent years as a significant net benefit. What I suspect gives particular comfort to those of us most involved with the heightened complexity of modern finance is the impressive role private market discipline plays in these markets. Importantly, derivatives dealers have found that they must maintain strong credit ratings to participate in the market. Participants are simply unwilling to accept counterparty credit exposures to those with low ratings. Besides requiring a strong capital base and high credit ratings, counterparties in recent years have increasingly insisted both on netting of exposures and on daily posting of collateral against credit exposures. US dealers, in particular, have rapidly expanded their use of collateral to mitigate counterparty credit risks. In these programs, counterparties typically agree that, if exposures change over time and one party comes to represent a credit risk to the other, the party posing the credit risk will post collateral to cover some (or all) of the exposure. These programs offer market participants a powerful tool for helping control credit risk, although their use does, as we all know, pose significant legal and operational issues.

Legitimate concerns

Despite the commendable historical loss record and effective market discipline, there are undoubtedly legitimate concerns and avenues for significant improvement of risk-management practices. Moreover, during the recent phenomenal growth of the derivatives market, no significant downturn has occurred in the overall economy to test the resilience of derivatives markets and participants' tools for managing risk. The possibility that market participants are developing a degree of complacency or a feeling that technology has inoculated them against market turbulence is admittedly somewhat disquieting. Such complacency is not justified.
In estimating necessary levels of risk capital, the primary concern should be to address those disturbances that occasionally do stress institutional solvency - the negative tail of the loss distribution that is so central to modern risk management. As such, the incorporation of stress scenarios into formal risk modeling would seem to be of first-order importance. However, the incipient art of stress testing has yet to achieve formalization and uniformity across banks and securities dealers. At present most banks pick a small number of ad hoc scenarios as their stress tests. And although the results of the stress tests may be given to management, they are, to my knowledge, never entered into the formal risk modeling process. Additional concern derives from the fact that some forms of risk that we understand to be important, such as liquidity and operational risk, cannot at present be precisely quantified, and some participants do not quantify them at all, effectively assuming them to be zero. Similarly, the present practice of modeling market risk separately from credit risk, a simplification made for expediency, is certainly questionable in times of extraordinary market stress. Under extreme conditions, discontinuous jumps in market valuations raise the specter of insolvency, and market risk becomes indistinct from credit risk. Of course, at root, effective risk management lies in evaluating the risk models upon which capital allocations and economic decisions are made. Regardless of the resources and effort a bank puts into forecasting its risk profile, it ought not to make crucial capital allocation decisions based on those forecasts until their accuracy has been appraised. Yet forecast evaluation, or "backtesting", procedures have to date received surprisingly little attention in both academic circles and private industry. Quite apart from complacency over risk-modeling systems, we must be careful not to foster an expectation that policymakers will ultimately solve all serious potential problems and disruptions. Such a conviction could lull financial institutions into believing that all severe episodes will be handled by their central bank and hence that their own risk-management systems need not be relied upon. Thus, overreliance on public policy could lead to destabilizing behavior by market participants that would not otherwise be observed - what economists call moral hazard. There are many who hold the misperception that some American financial institutions are too big to fail. I can certainly envision that in times of crisis the financial implosion of a large intermediary could exacerbate the situation. Accordingly, the monetary and supervisory authorities would doubtless endeavor to manage an orderly liquidation of the failed entity, including the unwinding of its positions. But shareholders would not be protected, and I would anticipate appropriate discounts or "haircuts" for other than federally guaranteed liabilities. As we consider potential shortcomings in risk management against the backdrop of an absence of significant credit losses in derivatives, one is compelled to ask: has the financial system become more stable, or has it simply not been tested? Probability distributions estimated largely, or exclusively, over cycles that do not include periods of financial stress will underestimate the likelihood of extreme price movements because they fail to capture a secondary peak at the extreme negative tail that reflects the probability of the occurrence of extreme losses.
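The tail problem just described lends itself to a simple simulation sketch. The following is a minimal, purely illustrative example - the distributions and parameters are invented, not drawn from any bank's model - showing how a 99% VaR measure fitted on calm history fails a basic backtest once occasional stress days are present.

```python
# Sketch: a 99% VaR model calibrated to calm markets understates the
# negative tail when rare stress days occur; counting exceptions (a basic
# backtest) exposes the problem. All data are simulated placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_days = 1000

calm = rng.normal(0.0, 1.0, n_days)                # ordinary trading days
stress = 1.5 * rng.standard_t(df=3, size=n_days)   # rare fat-tailed days
pnl = np.where(rng.random(n_days) < 0.95, calm, stress)

var_99 = 2.33 * calm.std()        # naive VaR fitted to the calm regime only

exceptions = int((pnl < -var_99).sum())   # days the loss exceeded the VaR
expected = 0.01 * n_days                  # a well-calibrated model's count

print(f"observed exceptions: {exceptions}, expected: {expected:.0f}")
# Materially more exceptions than expected is evidence that the model
# misses the secondary peak in the negative tail described above.
```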
Further, because the experience during crises indicates heightened correlations of price movements, joint distributions estimated over periods that do not include severe turbulence will understate correlations between asset returns during such episodes. The benefits of diversification will accordingly be overestimated. Another aspect of the system that may not have been appropriately tested is the set of credit risk modeling systems that have evolved alongside the growth in derivatives. Such models embody procedures for gauging potential future exposure. Prevailing prices will doubtless change in the future, so counterparties must assess whether contracts with small or even negative current values have the potential to result in large positive market values and, hence, a potential credit loss on default. Do such calculations adequately account for the possibility of prolonged disruptions or recessions? Are assumptions relating exposures to default probabilities sufficiently inclusive? These and other assumptions underlying the estimation of potential future exposure should continue to be examined in a critical light. Private equity activity Derivatives, no doubt reflecting their growth, their extensive use in hedging that facilitates additional risk-taking and their gigantic notional values, continue to be the quintessential image of financial engineering and innovation. But another dramatic change in the activities of banking organizations has received less attention: merchant banking. Indeed, the most dramatic change in the financial landscape that the Gramm-Leach-Bliley Act may have induced is not the combination of banking, securities underwriting and insurance, but rather the generalized merchant banking powers for financial holding companies. And even this change is really evolutionary for a handful of very large US banking organizations. By merchant banking, I mean equity investment in non-financial firms, most often, but not always, in non-public companies, with the investor providing both capital and financial expertise to the portfolio company. Such investments are usually held for three to five years, but often for as long as 10 or more, for subsequent resale to other investors. The recent financial modernization legislation gives banking organizations broad authority to make merchant banking investments but prohibits them from routinely managing the portfolio companies in which they have invested except in extraordinary circumstances for limited periods. In addition, banks' credit extensions to the firms in which their parents or affiliates hold equity are limited by the same sections 23A and 23B restrictions imposed on bank lending to their affiliates. Prior to the recent legislation, banking organizations could make only limited types of merchant banking investments, and these were made principally through three vehicles. First, since the late 1950s, banks and bank holding companies have been authorized to operate small business investment companies (SBICs) that can invest in up to half of the equity of an individual small business, currently defined by regulation as one with less than about $20 million of pre-investment capital. Such investments, in the aggregate, cannot exceed 5% of the bank's or BHC's capital.
Second, Edge corporations, which are primarily subsidiaries of banks but can also be subsidiaries of holding companies, can acquire up to 20% of the voting equity and 40% of the total equity of non-financial companies outside the United States. Finally, BHCs more generally can acquire up to 5% of the voting shares and up to 25% of the total equity of any company without aggregate limit. I have, of course, been referring to equity investments of banking organizations for their own account. BHCs' section 20 subsidiaries - and any future investment banking affiliates - also hold equities in trading accounts as part of their underwriting and trading activities. These daily mark-to-market holdings are quite large at a couple of banking organizations that have a significant equity underwriting business but are rather modest for others. Through the three long-term holding vehicles, banking organizations have made direct equity investments on their own and in partnership with others. They have also made indirect investments through private investment groups, sometimes acting as the manager of the group for performance-based fees. In the early 1960s, banking organizations were probably the dominant source of venture capital in the United States, and they still play an important role - perhaps currently accounting for 10% to 15% of the domestic private equity market. What has changed with the recent legislation is the generalized grant of authority for bank holding companies that qualify as financial holding companies to exercise merchant banking powers. There are now about 155 domestic and more than 10 foreign financial holding companies that could - but will not necessarily - undertake merchant banking. Two-thirds of the financial holding companies have less than $500 million in assets; about one-third have less than $150 million. In evaluating that general grant of merchant banking authority, it is useful to consider the experience of banking organizations that have been active participants in the private equity market in recent decades. To date, there have been no significant problems. To be sure, the record on private equity investment by banks is one of substantial year-to-year variation in return, just as one might expect with any portfolio of risky assets. Some of the deals have resulted in total write-offs, but overall the rates of return, especially in recent years, have been quite impressive - 30% or so per year in the last five years. In part, perhaps in large part, this reflects the substantial rise in equity prices. Still another historical factor has been the quite conservative treatment of equity portfolios by banking organizations. Both banks and independent securities firms engaging in merchant banking have tended to allocate substantial internal capital to support their private equity investment activity - between 50% and 100% - and to recognize unrealized capital gains only on traded equities or when some triggering event supported the revaluation of non-traded shares, and then only subject to a discount. In effect, banks have locked up significant internal capital for their equity purchases and have been conservative in recognizing gains in their earnings flows and, consequently, in their capital. For a small number of large banking organizations, equity portfolios are a significant share of their business already.
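As a purely hypothetical illustration of this conservative capital treatment, the arithmetic might run as follows. The balance-sheet figures below are invented, not those of any actual institution; the 50% allocation is the lower end of the range just cited, and the Tier 1 ratio ties into the figures that follow.

```python
# Hypothetical illustration of a 50% internal capital allocation against a
# merchant banking portfolio, and of the equity-to-Tier-1 arithmetic used
# in the figures that follow. All balance-sheet numbers are invented.
tier1_capital = 20e9      # assumed Tier 1 capital: $20 billion
equities_at_cost = 3e9    # assumed merchant banking book at cost: $3 billion

share_of_tier1 = equities_at_cost / tier1_capital
capital_set_aside = 0.50 * equities_at_cost   # 50% internal allocation

print(f"equities as a share of Tier 1 capital: {share_of_tier1:.0%}")
print(f"internal capital set aside: ${capital_set_aside / 1e9:.1f} billion")
# Locking up half the portfolio's cost in internal capital is what the
# "quite conservative treatment" described above means in practice.
```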
As of year-end 1999, at the five large banking organizations with more than one billion dollars invested, at cost, in equities, these assets accounted for between approximately 10% and 25% or more of Tier 1 capital at cost, and for between somewhat more than 10% and 35% at carrying value. Moreover, the pre-tax gains recognized last year - either at sale or because of revaluation - accounted for between 5% and 30% of pre-tax reported earnings in 1999 at these five banking organizations. In the first quarter of this year, such gains accounted for between 16% and more than one-half of pre-tax income. It is likely that authorization of merchant banking powers will lead both to deeper participation by the current large players and to wider merchant banking activity across banking organizations. To limit risks to the bank subsidiaries of the financial holding companies and to the insurance fund, the Federal Reserve's interim regulations require that, before this activity commences, organizations establish appropriate internal controls to manage the risks associated with this activity. It must be kept in mind, as I have pointed out in other contexts, that most bad commercial loans are made during prolonged periods of prosperity. I suspect that the experience of bank equity investment has been similar. The current interim regulations - which propose for comment a 50% capital charge on all non-trading account equities held by banking organizations - should not be viewed separately from the current state of the economy any more than commercial banking should be. In any event, at those entities with significant merchant banking portfolios, the above-average variance in stock prices will doubtless add to the variability of earnings of the overall organization - and hence, one can conclude, to the organization's valuation in the marketplace. There is, indeed, general agreement that the price-earnings ratio of trading banks is lower than that of other banks of the same size, although it has been difficult, because of the dynamics of other variables, to nail down empirically the appropriate orders of magnitude. And I suspect that, if the data were readily available, we might be able to demonstrate the same pattern at institutions significantly involved in the private equity market and perhaps even in derivatives trading. Any earnings stream that shows variability is appropriately discounted. That is not to say that real economic value is not being created for banking organizations, their shareholders and the economy from what appears to be a greater - and perhaps expanding - flow of venture and other equity capital from banking organizations. But despite the very good record to date in both the derivatives and private equity activities of banking organizations, we all would be remiss if we did not note that there are risks in these activities that, during some periods in the future, will create reduced returns, if not significant overall losses, for individual organizations. However, the same might be said about portfolios of loans - the traditional major asset of banks and one that will continue to dominate the business of most banks for the foreseeable future. Conclusion I have noted many times over the years that the purpose of banks and banking organizations is to take risk in order to contribute to, and facilitate, the growth and other needs of an economy.
We must be cautious, however, that we understand the nature of the new risks that have evolved with innovations in information technology and be certain that they are managed in ways that do not undermine this economic role. Balancing these objectives is no easy task. We need to ensure that strong risk-management systems are in place and that the managers of banking organizations use these systems both to enhance their awareness and understanding of the risks knowingly taken and to manage those risks accordingly. But systems are never perfect, mistakes will be made, and the tails of loss distributions represent a reality that sooner or later occurs. Individual foreign and domestic banking organizations have, from time to time in the past, suffered large losses in the derivatives and private equity markets. We will not be immune from such events in the future. But so long as we recognize the risks and insist on good risk-management systems, and so long as supervision moves - as it has - from balance sheet analysis to a review, evaluation and criticism of risk management systems, economic growth is, I suggest, enhanced by the kinds of financial innovation that technology and deregulation are now producing.
Roger W Ferguson Jr: Electronic commerce, banking and payments Remarks by Mr Roger W Ferguson Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, at the 36th Annual Conference on Bank Structure and Competition, Chicago, on 4 May 2000. * * * Over the past few years the topic of electronic commerce and banking has moved from the laboratory into the mainstream of our public discourse. This afternoon our panel has been asked to discuss the subject of alternative financial delivery systems. I would like to broaden this discussion a bit. The theme of my remarks will be that three types of variables - convenience, confidence, and complexity - are helping to shape the ongoing changes in electronic commerce and banking. I would then like to apply this theme briefly to the historical development of retail payment systems, in order to provide some insights into the changes in products and delivery systems that are now taking place. Finally, I would like to touch on the role of the central bank in addressing these developments and provide some information on the work of the Federal Reserve's Payments System Development Committee. Convenience, confidence, and complexity Electronic commerce is growing rapidly as our technologies for processing, analyzing, and transmitting vast quantities of data continue their extraordinary development. Consumer and business practices across a number of markets are changing, in some cases dramatically. In the end, there may be far-reaching and positive implications for the structure and efficiency of many of our markets. Three key sets of variables that are shaping electronic transactions and electronic commerce can be summarized under the headings of convenience, confidence, and complexity. Convenience refers to the capital, labor, time, and other real resources needed to conduct a transaction. Obviously, consumers and businesses wish to economize on the resources expended in conducting a transaction. Confidence refers to the trust that parties have in the elements of a transaction that generate risk to them. Financial, operational, security, and legal risks are relevant here as in many other contexts. Particular attention is currently being paid to the complex of "trust variables" relating to the authentication of transactions and parties, as well as to issues of privacy. Complexity is a shorthand reference to the ease with which the key features of a transaction can be standardized and automated and, ultimately, understood by the parties to the transaction. As we have now learned, however, it is not the good or service itself, sometimes called content, that necessarily has to be standardized in order to participate in electronic commerce. Rather it is the sales transaction and key related services that need to be standardized. There appear to be important tradeoffs among the convenience, confidence, and complexity variables that shape choices about electronic as well as other transactions. Greater convenience in transacting through open data communication networks, for example, may increase security and privacy risks and reduce user confidence. Greater complexity, in turn, may reduce the convenience of transacting electronically through data networks. One fundamental point, however, is that ongoing changes in technology are improving the terms of these tradeoffs, sometimes in several dimensions at the same time.
For example, traditional constraints on the timing and location of economic transactions are being relaxed simultaneously and rapidly in a number of markets, leading to large potential gains in convenience. Through the use of the Internet and automated business systems, many markets can now be open twenty-four hours per day at very low marginal costs. Transactions can take place at much more convenient times tailored to the specific needs of individuals and businesses, with either immediate delivery of services in some cases or with later delivery in others. New technologies are also reducing the need for buyers and sellers to meet at one location as well as the need for computers and telephones to be tied to traditional wire networks. The result is that even some traditional retail markets increasingly seem ubiquitous and global. In addition, as convenience factors such as time and location are changing, significant efforts are being made to strengthen confidence in electronic transactions. Various encryption systems have been deployed. Developments in public key infrastructure are being closely followed. Considerable attention continues to be paid to strengthening the law governing electronic transactions. And privacy has reemerged as a crucial commercial and legal issue. Electronic banking Electronic commerce involving banks is subject to the same forces as those affecting many other industries. New communications channels and devices, coupled with automated systems, allow a bank and its customers to transact an expanding range of business at virtually any time. According to recent statistics, nearly 40% of all US banks now provide some form of web site through which they can communicate with customers, and nearly 15% provide web sites that can be used to conduct banking transactions. These numbers are growing rapidly. Of the banks with more than $500 million in assets, nearly 50% now provide web sites that can be used to conduct transactions. In parallel with the development of new transaction systems, there is an ongoing trend toward standardizing and automating banking products, including loan products, which traditionally required special attention and approvals along with thick files of documents. As in other industries, this combination of developments is calling into question the size of investments in traditional delivery mechanisms, which are now disparaged as "legacy systems" and "brick and mortar" investments. As in other industries, banks are increasingly examining both the relative importance of their various delivery channels and the degree to which their products and services are integrated across the channels. In this environment, banks are continuing to experiment with new technologies, services, and business models. It is natural that there is both uncertainty and intense market competition surrounding promising innovations. Because of the rapidly changing nature of electronic commerce, some of these innovations will undoubtedly press the very definitions of banking. Of course there are also risks, along with the new business opportunities. These risks will continue to require careful monitoring and management. To do otherwise would undermine the hard-won confidence that, once lost, is not easily regained. Payments Turning to payments, traditional payment mechanisms such as currency and checks have held the field against many challengers for more than a century.
Undoubtedly the confidence that has been built up in traditional payment instruments has played a major role in their continuing success. Very interesting innovations, however, are being announced almost every day. Many of these innovations are being driven by efforts to improve the convenience of payment instruments and systems, including the timing and availability of electronic payments. It might be instructive to review briefly the history of retail payments to understand how the tradeoff among convenience, confidence, and complexity has worked. Looking back, the check was used in North America as long ago as colonial times. Businesses, in particular, were early users. The widespread use of the check by consumers did not occur in the United States until after World War II. Rising levels of income and restored confidence in the banking system led to the growth of deposit banking. Checks were used increasingly to make purchases over the counter as well as to pay bills. Checks allowed users to make payments for small as well as very large amounts at any time of the day, without needing to visit a banking office to obtain cash. Checks also allowed users to pay bills without visiting physical locations designated by service providers such as utility companies and other major billers. Thus checks offered more choices regarding the time and location for making payments and, at the same time, reduced the risk of theft and loss associated with cash payments. Ironically, innovations such as automated teller machines, which are not payment instruments but delivery mechanisms for cash, may well have supported the use of cash relative to checks or newer forms of electronic payments. ATMs initially offered a key banking service - cash withdrawals around the clock. With the latest surge in deployments, ATMs seem now to be located on nearly every street corner. Over the longer term, the expansion of ATM networks and their integration into broad “point-of-sale” networks may ultimately improve the convenience of and increase the demand for on-line debit cards. In countries such as Canada and the United Kingdom, 20% or more of non-cash transactions are now made by debit card over nationwide networks. Credit cards offer another interesting example. Credit cards began more than 75 years ago as store charge cards. These cards received a boost in the 1960s with the creation of branded bank cards and have since grown in popularity. The cards can be used on a 24-hour basis. Initially, the locations where they could be used were limited, but these have grown significantly along with overall credit card use in recent years. Consumers and merchants have now widely adopted credit cards to make payments arising from electronic commerce. There have also been efforts to make the use of credit cards over open networks more secure and to increase protections to cardholders. Other attempts at payment innovations also suggest that convenience, confidence, and complexity are important. The automated clearinghouse, designed to provide a very low cost electronic payment mechanism, has been very successful in automating many types of recurring payments. Early uses of the ACH, however, did not generally provide for flexible interfaces with consumers and businesses. To make an ad hoc electronic payment over the ACH, for example, would generally have required a special trip to a full-service banking office during regular business hours. 
From the standpoint of timing and location for making such types of payments, the check was clearly a superior instrument for consumers and many types of businesses. Some recent innovations such as point-of-sale check truncation and electronic bill payment systems now provide interfaces between the ACH and consumers and businesses that may significantly stimulate the use of the ACH over the longer term. In pilot tests of stored-value products, consumers have been able to use innovations such as stored-value cards at only a very limited number of locations. There have been no real market tests yet of cards that can be reloaded at home computers or telephones. In theory, this capability could be equivalent to placing an ATM in every household. On balance, because consumers have not yet perceived the characteristics of stored-value cards to equal or improve on those of cash, it is no wonder that the cards have not done well commercially in early trials. However, providers of stored-value products have an incentive to make the use of those cards more attractive than cash in terms of the tradeoff among convenience, confidence and complexity. If they do that, it is quite possible that future tests will be more successful. Potential lessons and innovations Our experience with innovations in the payment system suggests several lessons. First and foremost, an innovation should have a "value proposition" that works for both providers and users. Providers must be able to earn a competitive return on the product; otherwise, they will have no incentive to supply the innovation to the marketplace. For users, innovative products will need to offer combinations of convenience, confidence, and complexity in making payments that provide advantages over existing payment instruments and systems, and they will need to be competitively priced. Innovations that simply offer greater convenience but lower confidence may not be successful. Conversely, innovations that offer somewhat less convenience but improve confidence factors such as security and privacy may also be less than successful, at least initially. The final judgments in these cases, however, will have to be made by consumers and businesses in the marketplace as they weigh different variables against each other. Second, innovations in payment systems may provide new ways of doing business for providers and users that go beyond the process of payment itself. Electronic payment mechanisms, in particular, may lead to the combination of financial, payment, and other activities in new ways, particularly if data are brought together at one time and location for users. Indeed, new software offers low-cost opportunities to combine data and activities in ways that may not even have seemed practical a few years ago. Current electronic bill payment services and projects are one example. It is becoming increasingly convenient not only to make on-line bill payments but also to combine this activity electronically with financial analysis, cash and investment management, record management, and related functions. Third, electronic payment systems typically require a communications infrastructure along with technical, business, and legal rules in order to function effectively. The advent of the Internet and other types of network services may reduce the cost and complexity of putting such infrastructure in place.
Relatively little new infrastructure was required to use the Internet and existing credit card networks as communication tools for making credit card payments to support electronic commerce, and growth has been rapid. Other innovations may also be able to build on the Internet and established payment networks, such as the ATM and ACH networks, in order to expand the range of payment options in electronic commerce. Fourth, economic “network effects” may be important in determining which innovations succeed or fail, at least in the short run. In general, one aspect of a network effect is that the value of a network to its users increases as more users join. We are familiar with this effect in the telephone and other communications markets. In payments, if too few consumers or merchants use a payment network or a new instrument, the system may not be sufficiently valuable to its users for it to become economically viable. To date, some innovations such as stored-value products may have been less than successful in part because of these effects. The role of government and the Federal Reserve Despite some of the challenges in shifting from a paper-based to a more electronic payment system, it is clear that the United States fundamentally has a safe and reliable retail payment system. As that system continues to evolve, the private sector will play the pivotal role in most innovations, while the Federal Reserve will also continue to play a strong and important role. In general, government, including the Federal Reserve, must continue to foster the safety and soundness of the payment and financial system, promote competitive markets, and ensure adequate levels of consumer protection. The Federal Reserve can also continue to modernize its existing payment services and work with the private sector to identify and, when appropriate, address barriers to payment-system innovation. Last July, the Board announced the formation of the Payments System Development Committee, which I co-chair with Cathy Minehan, the President of the Federal Reserve Bank of Boston. This new group is focusing on key medium- and long-term public policy issues surrounding the development of the retail payment system. In particular, the Committee is seeking to work with the public to identify barriers to the future development of the payment system and to recommend solutions to the Board and other authorities. During this year, the Committee is focusing on four important areas relating to retail and low-value commercial payments. First, we are attempting to learn from both Federal Reserve and private-sector experience with truncation and electronic check presentment, and to identify barriers to greater use of electronic technologies to collect checks. Second, we are assessing gaps in standards that may be inhibiting payment system innovation. Third, we are reviewing legal and regulatory issues, with an emphasis this year on the legal underpinnings for converting checks to electronic payments. Finally, we are examining the long-run strengths and weaknesses of the clearing and settlement systems for electronic payments. In addition, the Committee is following with great interest the many payment innovations that are currently taking place in the market. Most importantly, the Payments System Development Committee is seeking to foster communication with the public about the development of the retail payment system through meetings, workshops, and other forums. 
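Returning briefly to the network-effect point above, a tiny sketch makes the intuition concrete: if any two participants can potentially transact, the number of possible connections grows roughly with the square of the user count, so a payment network with few users offers early adopters little value. The user counts below are arbitrary examples.

```python
# Sketch of the network effect noted above: potential user pairs grow
# roughly with the square of the user count, so small payment networks
# offer little value to early adopters. User counts are arbitrary.
def potential_connections(users: int) -> int:
    """Distinct pairs of users who could transact with one another."""
    return users * (users - 1) // 2

for users in (10, 100, 1_000, 100_000):
    print(f"{users:>7,} users -> {potential_connections(users):>14,} potential pairs")
```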
Conclusion Overall, a number of innovations are taking place in the retail payment system, along with very creative thinking by both traditional and non-traditional participants. Many of these innovations closely mirror much broader developments in electronic commerce and banking. Payment system innovations that improve efficiency and confidence are welcome developments. Because of the complex nature of our economy and the fundamental role of the market, many of these innovations will necessarily come from the private sector. A particularly important challenge for the Federal Reserve is to find effective ways to work with the private sector to identify and address genuine obstacles to innovation so that today’s promise can give rise to tomorrow’s achievements.
Roger W Ferguson, Jr: Conversation with leaders of the “New Economy” Speech by Mr Roger W Ferguson, Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, at the New Economy Forum, held at Haas School of Business, University of California, Berkeley, on 9 May 2000. * * * Thank you for inviting me here today to discuss with you the “new” economy and its implications. As always, the views I will be expressing are my own and are not necessarily shared by other members of the Board of Governors or the FOMC. With my normal disclaimer out of the way, I’d like to turn to a brief review of the extraordinary performance of the US economy over the past five years. Since 1995, real gross domestic product has grown, on average, more than 4¼% per year. This is significantly above the pace in the previous five years, and you have to go back to the decade of the 1960s to find even closely comparable periods of consistently robust economic expansion. In this environment, the unemployment rate has fallen to 4%, and the underlying rate of price inflation has slowed, on net, despite very high rates of resource utilization. Even the most optimistic of forecasters could not have anticipated such a favorable confluence of economic events. Productivity growth and cost reductions So, what happened? As a policymaker, I’d like to think that well-executed monetary and fiscal policies played some role in creating an economic environment that was conducive to non-inflationary economic growth. Our economy has also benefited from past actions by the government to deregulate industries. The removal of unnecessary government regulation started more than twenty years ago, during the administration of President Ford, and gathered momentum during the Carter years. It has altered the business landscape. Deregulation allowed, indeed forced, businesses to focus more clearly on a marketplace that has become more competitive, with fewer constraints and increased flexibility. But the dominant force of late appears to have been a significant upshift in the rate of productivity growth. After increasing 1.6% per year from 1990 to 1995, output per hour in the non-farm business sector - a conventional measure of productivity - has increased at an annual pace of about 2.6% since 1995. Cyclical forces - such as the inability of businesses to add to their payrolls as rapidly as they would have liked in response to the rise in demand - have probably played some role in these efficiency gains. But I suspect that longer-term, structural changes, reflecting the boom in capital spending and the revolution in information technology, probably have been more important. I will return to the evidence of this point shortly. Why are the growth rate of productivity and the sources of that growth so important to policymakers? Very simply, economic theory indicates that, over the long run, faster growth of labor productivity allows faster growth of real wages - that is, wages adjusted for inflation. Without faster growth in productivity, businesses faced with a nominal wage bill that is increasing faster than productivity would be tempted to pass those increased labor costs to consumers in the form of higher prices for goods and services in order to protect profit margins. Adding the growth rate of labor productivity, or output per hour, to the growth rate of the number of labor hours gives an approximation of the rate of increase in the economy’s ability to create goods and services. 
Since labor hours tend to be determined in the long run by increases in the working-age population, growth in productivity is the focal point. Why then are the sources of productivity growth important? If that growth in productivity is due to a change that outlasts the business cycle, then economists and policymakers can have confidence that the productive potential of the economy has changed. Economists speak of that as "trend" productivity and the resulting growth rate as the "trend" growth rate. The structural changes that I mentioned above have had effects beyond increasing the rate of productivity growth. They have also enhanced the ability of businesses to reduce their operating expenses. In many industries, investments in information technologies have helped firms to cut back on the volume of inventories that they hold as a precaution against glitches in their supply chain or as a hedge against unexpected increases in aggregate demand. In fact, we have seen that the ratios of inventory to sales and inventory to shipments have both trended down since 1995. Product development costs have probably also been reduced through the use of better computer hardware and software, and new communications technologies have increased the speed with which firms can share information - both internally and with their customers and suppliers. This intense focus on cost reduction has been an important element in helping to head off the development of inflationary pressures in this expansion. Moreover, given intense competition and the resultant lack of pricing "leverage", ongoing programs to reduce costs have become a key part of corporate strategies to maintain or improve profit margins. Technology change and productivity growth Bob Solow - the MIT economist who won the Nobel Prize in economics for his work on the theory of economic growth - once quipped that you can see computers everywhere except in the productivity statistics! That situation has recently begun to change, and we now have strong evidence that the productivity growth that our economy has experienced is in fact due in part to newer technologies. Research by two economists on the Board staff - Steve Oliner and Dan Sichel - sheds some light on the sources of this faster productivity growth. About one-half of the 1 percentage point increase in productivity growth over the 1995-1999 period can be attributed to so-called "capital deepening". As everyone here is well aware, providing workers with more equipment improves their efficiency. Likewise, at the aggregate level, the high (and rising) levels of business investment raised the amount of capital per worker and thereby boosted productivity. It is also interesting to note that most of the capital deepening reflected greater spending by businesses on high-tech equipment: computers, software, and communications equipment. Another 1/2 percentage point of the pickup in productivity growth reflected technological innovations in the actual production of computer hardware and semiconductors, as well as better management of the nation's capital and labor resources - perhaps assisted by these high-tech investments. Oliner and Sichel estimate that, if one consolidates all the influences of high-tech investments, they account for about two-thirds of the acceleration in productivity since 1995. This research supports the view that fundamental changes are under way in our economy.
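The arithmetic behind these estimates can be tallied in a few lines. The sketch below simply adds up the rounded figures quoted above, plus an assumed rate of labor-hours growth for the final line; it illustrates the accounting and does not reproduce the underlying Oliner-Sichel study.

```python
# Back-of-the-envelope growth accounting using the rounded figures quoted
# in the text; purely illustrative.
productivity_1990_95 = 1.6   # % per year, nonfarm business output per hour
productivity_1995_99 = 2.6   # % per year

acceleration = productivity_1995_99 - productivity_1990_95   # ~1.0 pp
capital_deepening = 0.5      # pp: more (largely high-tech) capital per worker
production_and_mgmt = 0.5    # pp: efficiency in making computers and
                             # semiconductors, plus better management

print(f"acceleration in productivity growth: {acceleration:.1f} pp")
print(f"  from capital deepening:            {capital_deepening:.1f} pp")
print(f"  from tech production/management:   {production_and_mgmt:.1f} pp")

# Adding labor-hours growth (assumed here at 1.2% per year) to trend
# productivity approximates the economy's capacity to expand output.
hours_growth = 1.2
print(f"approximate potential output growth: {productivity_1995_99 + hours_growth:.1f}%")
```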
But technological waves ebb and flow, and it is natural to ask whether we can count on such rapid productivity growth in the future. On this score, I am cautiously optimistic. But, as an economist, I need to see hard evidence of actual ongoing productivity gains or cost reductions in the economic statistics to truly believe that the world is continuing to change in a fundamental way. I am confident that the efficiency gains that have already been achieved are permanent: the investments have been made, the technologies are in place and are being disseminated, and production is proceeding apace. But I think that it is wise to seek support for the assertions that all of the new technologies and business practices now coming to the fore will prove to be as revolutionary as some of their marketing materials suggest. Clearly, there is great potential to improve efficiency using Internet-based e-commerce strategies such as electronic marketplaces and business-to-business supply chain management. But no one really knows how big those productivity gains will be, how long they will take to be realized, and who will be the ultimate beneficiaries. Have not other technologies emerged over the past fifty years - such as the television, the jet engine, or even air conditioning - that were equally revolutionary? Indeed, the transatlantic cable and the telephone probably revolutionized communications as much as any other technology. The Internet has attracted the most media and public attention as a symbol of the new economy. It clearly improves communication, collapsing time and space, but are we overstating the potential benefits of this one, admittedly stunning, innovation? Does the Internet have the potential to continuously improve business processes, as some enthusiasts argue, and if it does, what conditions are required to achieve that? I hope that, given the expertise of the participants at this symposium, we can return to these and related questions in the discussion period. The macroeconomic implications of faster productivity growth A step-up in the growth rate of technological change certainly would have important implications for economic activity and inflation. As I indicated above, the main reason policymakers and economists are interested in the growth rate of productivity is that understanding that rate gives a clear understanding of the economy's potential to supply goods and services. Where would we look for corroborating evidence of this improved growth rate in technological change? The most immediate effects would be on capital investment. A more rapid pace of technological change raises the real rate of return on new investments - perhaps significantly. Put another way, a more rapid pace of technological change makes investments in capital goods embodying the new technology more profitable. When businesses recognize the new technological possibilities, capital spending accelerates to take advantage of the new profit opportunities. Businesses can then produce more output with the same labor input. While supply-side effects are clear, a new, higher level of productivity growth would also affect the demand side of the economy. The employment and income generated by business spending on capital goods boosts consumer spending and sets off another round of investment spending. Typically referred to by economists as "multiplier-accelerator" effects, such processes would continue as long as the real rate of return on a new capital project exceeded the real cost of capital for that project.
This is the process through which an innovation on the supply side of the economy generates a comparable increase in aggregate demand. Theory also teaches that the increase in the rate of return on capital - even if generated by a rise in the growth rate of technical change - ultimately requires an increase in real market interest rates. Market interest rates must rise in order to maintain equilibrium between the demand for investment funds, which increases, and the supply of investment funds. And, indeed, we have seen that market interest rates, particularly for corporate issuers, have risen steadily for the last year or so. This somewhat abstract description of the effects of a step-up in the growth rate of technical change bears a striking resemblance to the developments in labor markets, prices of goods and services, capital investments, and fixed-income markets of recent years. But there's still an element missing. How does the performance of the stock market in recent years fit into this picture? A higher rate of technical change that raises the productivity and hence the profitability of capital should elevate the valuation of equities. But how much should stock values rise under those circumstances? Are stocks today overvalued, correctly valued, or undervalued? I certainly do not know, and I am not aware of anyone who does. As a result, I believe that it would be unwise - and indeed impossible - for the Federal Reserve to target specific levels of valuations in equity markets. However, equity markets obviously do have spillover effects on the real economy and, thus, need to be considered in assessing the aggregate balance of supply and demand. Given the efficiency and forward-looking nature of financial markets, even prospective technical innovations will have an immediate effect on equity valuations. Equity valuations in turn can influence consumer behavior. As you know, economists often speak of the "wealth effect", and econometric modeling indicates that consumers tend to raise the level of their spending by between 2 and 5 cents for every incremental dollar of wealth over a period of two to three years. As a consequence, equity valuations can have a noticeable effect on consumption and on macroeconomic performance. Additionally, equity markets are a source of investment capital, and valuations in the stock market are one determinant of the cost of capital for businesses. To put a rough number on these influences, simulations by the Board staff using our econometric model of the economy suggest that wealth generated in the equity markets over the last four years added about 1 percentage point to the growth rate of real GDP. Some particularly enthusiastic observers of the "new" economy argue that inflationary pressures are no longer a risk. I firmly believe that we should recognize that, even in a high-productivity economy, stresses and imbalances might emerge. In the present context, the most obvious indication of an imbalance is the current account deficit, which is both large and growing. This means that we are financing investment with savings from overseas. The other indicator of an imbalance between demand and supply growth is the gradual decline in the unemployment rate over the last few years.
It may be that this imbalance has served only to bring the unemployment rate to a new and lower sustainable rate, but it is also true that the wedge between demand and supply growth cannot continue indefinitely because, once pressures on limited resources rise sufficiently, inflation will start to pick up. Monetary policy and the "new" economy As I have said many times before, uncertainty about productivity trends is a major challenge in the design and implementation of monetary policy. As you can imagine, it is very difficult to infer the true structure of the economy through the interpretation of the twists and turns of incoming economic data. How do we know, for example, whether unexpected developments are just temporary movements away from stable longer-run relationships or are manifestations of changes in the underlying economic structure? In many cases, this judgment is difficult to make with much confidence even considerably after the fact. In the meantime, we must bear in mind that the statistical relationships we work with, embodied in our econometric models, are only loose approximations of the underlying reality. The considerable uncertainty regarding statistical constructs such as the "natural" rate of unemployment or the "sustainable" rate of growth of the economy suggests, in my judgment, the need to downplay forecasts of inflation based solely on those variables. Some fog always obstructs our vision, but when the structure of the economy is changing, the fog is considerably denser than at other times. What should be done when such uncertainties seem particularly acute? When we suspect that our understanding of the macroeconomic environment has deteriorated, as evidenced by strings of surprises difficult to reconcile with our earlier beliefs, I think that the appropriate response is to rely less upon the future predicted by the increasingly unreliable old models and more upon inferences from the more recent past. That means we should weight incoming data more heavily than data from decades past in trying to make judgments about the new economy and, of course, act appropriately when trends become clear. But, even for those of us who take this more pragmatic approach, there are many serious challenges. Economic data are notoriously volatile, are easily affected by a variety of special factors, and are subject to revision as more reliable or more complete sources become available. For example, there are several estimates of the growth in real GDP in any particular quarter. The so-called "advance" estimate contains numerous assumptions about missing source data. One month later, the "preliminary" estimate is produced with a more complete information set. And, one month after that, the "final" estimate is generated. But that's not the end of the story. Once a year the GDP data are revised again to incorporate the results of source data that are only available annually. Thus, over this revision window, the picture painted by the GDP data can change significantly, and policymakers obviously need to be aware of this to avoid attaching too much significance to any one piece of data. Moreover, it is at uncertain times such as these that the wisdom underlying the institutional structure of the FOMC becomes most apparent. A committee with broad representation can bring a variety of perspectives and analyses to bear on difficult economic problems.
In addition, the anecdotal reports that the presidents of the Federal Reserve Banks bring from their Districts are especially valuable in the decision-making process because they afford a “real-time” sense of what is going on in the economy. Such diversity of information sources becomes particularly useful when our earlier assessment of the economy’s structure has been drawn into question by surprises, even pleasant ones. Even in a period of some uncertainty, monetary policy authorities have an important responsibility to remain vigilant with regard to inflationary pressures. Since in the long run there is no tradeoff between unemployment and inflation, we know that keeping inflation low and stable and maintaining an obvious stance of vigilance vis-à-vis inflation, so that inflation expectations are also relatively low, is the main value that a central bank can add to this equation. Besides the issue of how monetary policy should respond to a productivity shock, questions have recently resurfaced about the effectiveness of any actions that the Federal Reserve might take. Some analysts note that economic growth has not slowed even though the FOMC has raised the federal funds rate five times over the past year, and they draw the conclusion that the central bank has lost its effectiveness. I do not share that view. There have always been lags between the initiation of a monetary policy action and its effect on the economy. And, as Milton Friedman pointed out many years ago, these lags are “long and variable”. Unanswered questions As I promised, I will now pose several questions about the new economy to this gathering. I can assure you that this will not be a multiple-choice test, and it will ultimately be up to your shareholders and the American people to grade all of our answers. First, what has been so special about the technological developments since 1995? How quickly have the innovations to computing and communications technology diffused throughout the economy? How much diffusion of the current technologies remains to be accomplished? Second, what makes the Internet unique in its potential to improve productivity, and is the potential greater than that of other recent technological developments? What are the potential benefits - and costs - of the adoption of the commercial and communication strategies required to fully use the Internet? Does the Internet have the potential to continuously improve business processes, as some enthusiasts argue, and if it does, what conditions are required to achieve that? Third, how have you used information technology or the Internet to improve productivity or reduce costs in your own businesses? How far along are you in this process? How important is spending on research and development to the long-run competitive position of your own business and the pace of innovation in your industry? Are new technologies emerging from R&D that have the realistic potential to increase productivity growth in the economy even further? Fourth, do the methods used to value the so-called dot-coms differ from those used to value “old economy” companies, and should they differ? How should gross margins be considered in valuing dot-coms? Do tools used to value direct-mail companies apply to dot-coms? Given recent volatility in equity prices and the IPO market, are venture capital funds less forthcoming? Will new ventures still emerge at the same pace in a period of equity market volatility? 
Concluding remarks In conclusion, let me remind you that, while these are challenging times for monetary policymakers and financial market participants, the US economy is enjoying a period of unprecedented prosperity. Technological developments associated with the information revolution are truly transforming the way we work and play. Our job at the Federal Reserve is to do our utmost to produce a stable economic environment without inflation so that these trends can continue. I look forward to discussing these issues with you.
Roger W Ferguson, Jr: Economic policy for our era - the Ohio experience Remarks by Mr Roger W Ferguson Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, at the Cleveland Millennium Conference 2000, held in Cleveland, on 11 May 2000. * * * What makes an economy world-class? A world-class economy, as I understand the term, is an economy that successfully competes at the international level. I doubt whether there are many places in this nation that have as clear a perspective on the world economy as does northeast Ohio. One-quarter of the nation's manufacturing output is produced in an area that lies within a half day's drive from Cleveland. The region generates more than 40% of the nation's transportation equipment, 30% of its industrial machinery, and 44% of its metals - industries that make up an important part of the nation's re-energized trade sector. Consider that Ohioans exported more than $2,200 in merchandise per person in 1997. About one in four dollars of metalworking machinery production, to which this region is a major contributor, was exported. Steel mill producers in Ohio - another of your revitalized industries - have more than doubled their export volumes since the mid-1980s. And in the transportation equipment industry, the foreign-owned Honda assembly plant in Marysville, Ohio, which produced roughly half a million cars in 1998, is the largest automobile assembly plant in North America. Today, I would like to offer some observations, from the perspective of a policymaker, on the events that have already transformed much of the national and regional economies and that are continuing to reshape business around the globe. Specifically, I want to reflect on the changing role of economic policy in our current environment of rapidly improving communications and expanding markets. To sustain the gains that this region and other regions have made in the past decade and to best ensure our continued global competitiveness, we need to fashion economic policy that, above all else, helps to facilitate communication through efficient and effective markets. Recent economic developments The national economy is enjoying an impressive period of prosperity. National income, after adjusting for inflation, has grown about one-third since 1991 - or by about 3½% per year. US joblessness has fallen to a level not seen in thirty years, and wealth is being created at a pace rarely seen. Growth in this region has been even more impressive. On a per capita basis, Ohioans have seen 5% more income growth than the nation as a whole during the past eight years. Economic strength is also reflected in local labor market indicators. After many years of sub-par performance, and occasional periods of outright decline, the net growth of jobs in the region has kept pace with the exceptional US average. Even more telling has been the remarkable pattern of the local unemployment rate. After averaging more than 1 percentage point above the national average in the 1980s, joblessness in the Cleveland area fell under the US average in 1990 and has remained at, or below, the national benchmark every year since. The recent prosperity of the region reverses the previous twelve-year period of economic decline relative to the nation in a rather dramatic way. This decline, not so flatteringly referred to by some as the "rust-bowl era", took its toll on labor and business alike.
After peaking in the early 1970s, the population of the six-county area surrounding and including Cuyahoga County declined annually for nearly two straight decades. But since 1990, more families have been arriving than leaving, which can be due only to this area’s rejuvenated economy. What accounts for this remarkable reversal in economic fortune? As I have said recently, on the national level, and in this region as well, the dominant force of late appears to have been a significant upshift in the rate of productivity growth. Having increased 1.6% per year from 1990 to 1995, output per hour in the non-farm business sector - a conventional measure of productivity - has risen at an annual pace of about 2.6% since 1995. Cyclical forces - such as the inability of businesses to add to their payrolls as rapidly as they would have liked in response to the rise in demand - have probably played some role in these efficiency gains. But I suspect that longer-term, structural changes, reflecting the boom in capital spending and the revolution in information technology, probably have been more important. Through this increase in productivity, our national economy has successfully prepared itself to take advantage of the rapid globalization that has characterized the current economic expansion. While private decisions rightly deserve primacy in any discussion of the current economic climate, they were taken against the backdrop of important policy decisions. I believe that this productivity increase might not have occurred were it not for the policy adjustments that were made starting in the late 1970s and continuing even today. Furthermore, the opening of many nations’ economies to our goods and services reflects, in my judgment, the fact that the world’s policymakers have, in general, abandoned the economic policies that were found to be counterproductive. In the end, free trade, deregulation, sound fiscal policy, and sound monetary policy have all played a role in the strength of the US economy. These same factors are emerging as equally important in other economies. Economic prosperity, trade and global integration In economics, nothing is more fundamental than trade. Trade allows us individually, and as a nation, to devote our scarce resources to their most advantageous uses and then exchange our products with others to satisfy our diverse preferences. This process allows specialization, and it is what gives rise to the existence of markets. The lifeblood of trade is communication. Communication allows us to find the most profitable outlets for our products and suppliers for our needs and wants. The greater our capacity to communicate, the greater our ability to specialize, the broader our expanse of markets, and the more prosperous we become. These are not new ideas. They have shaped our understanding of how nations become wealthy since Adam Smith described them more than two hundred years ago. Today, we are experiencing a great technological revolution - a communications revolution. The proliferation of microprocessors and other innovations of the past several decades has dramatically lowered the costs of getting and transmitting information. Predictably, the new communications technology has brought with it a growth of new markets. This great expansion of markets has allowed the US economy to improve its allocation of resources by shifting them to their most internationally competitive uses. 
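As an aside, the growth figures cited earlier in this passage are easy to verify with compound-growth arithmetic. The short sketch below is purely illustrative and not part of the original remarks; the exact year spans are assumptions.

# Illustrative check of the growth figures cited above; year spans are assumptions.

def annualized(total_growth: float, years: float) -> float:
    """Convert cumulative growth over a span into a compound annual rate."""
    return (1.0 + total_growth) ** (1.0 / years) - 1.0

# Real national income up about one-third since 1991 (roughly a 9-year span):
print(f"{annualized(1/3, 9):.1%}")  # ~3.2%, consistent with the ~3.5% per year cited

# Productivity: 1.6% per year over 1990-95 versus about 2.6% per year since 1995.
# The one-point pickup compounds into a visibly larger five-year level gain:
print(f"{1.016 ** 5 - 1:.1%}")  # ~8.3% level gain at the old trend
print(f"{1.026 ** 5 - 1:.1%}")  # ~13.7% level gain at the new trend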
It also seems probable that these new communications technologies have brought greater openness in global markets by helping us to break down the complex and unproductive network of artificial trade barriers that characterized much of the previous century. The role of international trade and finance in bringing renewed prosperity in the past decade to the economy of Ohio is noteworthy. From 1987 to 1997, Ohio’s exports grew 60% faster than exports overall in the United States - and US export growth was very strong indeed. By 1997, the state had jumped from being the eleventh highest export state to being the seventh. And in 1996, the Cleveland area ranked twenty-third in the top seventy export communities in the nation. This region’s influence in the world economy appears to be still growing as its capital base expands. Data from the US Bureau of the Census indicate that, between 1982 and 1996, the amount of new capital added in Ohio industry grew as a share of all US capital additions. Specifically, while US industry was adding about 4¼% per year to its stock of industrial capital, Ohio was adding capital to its industry at a 5% clip. In 1998 and 1999, slightly more than 2,100 new major projects were begun in Ohio, which puts the state among the top five states in attracting and expanding business. Moreover, about 6% of these business expansions were financed by foreign investors, of which slightly more than one-third were Asian investors and about one-half were European. The Ohio Department of Development estimates that 851 foreign-owned corporations provided only slightly less than one in twenty jobs in the state last year. Almost 75% of the foreign establishments were in the manufacturing sector, where trade opportunities have been the greatest. And the single largest regional concentration of foreign-owned businesses in the state was in Cuyahoga County, with 145 establishments. What has this investment wrought? Today, output per hour in the region’s manufacturing sector is hardly reminiscent of the economy of fifteen years ago. In industrial machinery manufacturing, for example, new capital expenditures almost doubled between 1987 and 1996, well in excess of the national average. At the same time, the productivity of Ohio industrial machinery workers jumped from more than 10% below the national average to more than 10% above the national average. This story can be repeated for a number of industries throughout the region. The cost of growth Economic transformation has not come without cost. Between 1977 and 1987, US industry reduced production jobs in manufacturing by 1.4 million workers. More than 200,000 - or 15% - of those jobs were in Ohio. Of those job losses, over half were centered in two industries - primary metals manufacturing and industrial machinery manufacturing - each losing more than 50,000 jobs over the decade. In fact, the region’s new competitiveness could probably not have occurred were it not for the dramatic changes this area’s economy experienced in the 1980s. Is economic progress possible that does not make obsolete the methods and practices of the earlier, less efficient economy? In his 1950 book Capitalism, Socialism, and Democracy, economist Joseph Schumpeter described capitalism as a system “that incessantly revolutionizes the economic structure from within, incessantly destroying the old one, incessantly creating a new one.” Schumpeter saw that economies continually bounce from one growth path to another, all the time remaking themselves. 
He coined the phrase “creative destruction” to describe this process. Simply put, economies are constantly under competitive pressure to re-invent themselves. As they move toward higher levels of productivity, they necessarily make other production technologies obsolete. Schumpeter cautioned that economic policymakers who fail to appreciate the relationship between the relentless churning of the competitive environment and wealth creation will end up focusing their efforts on the methods and skills that are in decline. In so doing, they establish policies that are aimed at protecting weak, outdated technologies, and in the end, they slow the economy’s march forward. In retrospect, we can tell that some economic policies of the past century have inadvertently, or in some cases intentionally, done just that. They have had the effect of directing or misdirecting economic growth by either substituting policymakers’ judgment regarding the distribution of an economy’s assets for the combined wisdom of individuals or allowing markets to send false signals. In the long run, such policies were destined to fail. The economic policies of the last century A very broad reading of economic history reveals that policymakers in many countries during the last century attempted to manipulate trade and other forms of economic activity by altering, artificially, the measures of value, that is, prices. One such policy followed by some countries during the last century was known as the “beggar thy neighbor” policy, the manipulation of the exchange rate in order to boost a country’s exports. Trade restrictions were also often used to protect domestic industries from imports. A final example from the international sphere is the system of global fixed exchange rates that emerged following the Second World War. To blunt market forces, fixed exchange rates were usually accompanied by capital controls that tried to manage the inflows - and more importantly the outflows - of a nation’s investment funds. Ultimately, this system of global fixed exchange rates worked poorly and could not withstand the market forces that emerged in the 1970s. In a similar spirit, some economies used taxes or other incentives to promote one industrial activity or discourage another. Obviously, the most egregious form of this policy was in planned economies. But many democratic economies, as they recovered from various wars and other national traumas, nationalized entire industries. In our society, we never found that degree of government intervention appropriate, but we did regulate some business decisions for certain industries, such as electric power distribution and airlines, attempting to overcome the “natural monopoly” or “excessive competition” characteristics perceived in these industries. Finally, some central banks in the past engaged in policies that artificially altered the path of domestic prices in their effort to regulate their business cycles. If the monetary authority wanted more growth above trend, it lowered money-market interest rates by expanding the stock of money. Such policies were expected to bolster demand and encourage an acceleration of growth. There was the misunderstanding that somehow a long-run tradeoff existed between inflation and unemployment. 
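That misunderstanding can be stated compactly with the expectations-augmented Phillips curve - a textbook formalization added here for illustration, not language from the speech:

\pi_t = \pi_t^{e} - \alpha\,(u_t - u^{*}), \qquad \alpha > 0,

where \pi_t is inflation, \pi_t^{e} is expected inflation, u_t is the unemployment rate, and u^{*} is its sustainable (natural) rate. Expansionary policy can hold u_t below u^{*} only while expectations lag actual inflation; once \pi_t^{e} catches up to \pi_t, the equation forces u_t = u^{*}, leaving the economy with higher inflation and no lasting employment gain.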
But it gradually became understood that inflation eroded investor and consumer confidence and distorted behavior, both because the average of prices gave a constantly depreciating reading of the values it was supposed to represent and because relative prices provided an inaccurate reflection of comparative worth. Monetary policies intended to create growth through the inflation of prices ended up impeding markets and reducing economic prosperity. We now know that there is no long-run tradeoff between inflation and unemployment. The US experience of the last several years has also taught us that low and stable inflation is the underpinning for sustainable growth and that sustainable growth fosters the maximum creation of jobs over time. Emergence of the communications era In recent decades, trade restrictions, “beggar thy neighbor” policies, and the pursuit of a supposed long-run tradeoff between inflation and unemployment have all been called into question and generally rejected. In part because of the communications revolution and the substantially reduced costs of transacting from great distances, businesses have sought more globally integrated production processes, and investors have required the development of financial instruments to facilitate their demand for international portfolio diversification. Such developments have put enormous pressure on policymakers to loosen their grip or abandon policies that led to the misallocation of resources. Tariffs have been reduced, and restrictions on the flow of goods have been eased. Controls on the flow of investment capital have been eliminated in most industrialized countries, and they are rapidly coming down in many developing nations as well. In some cases, these changes were more or less forced upon the nations that adopted them. But in many instances the policies have been liberalized because of the realization that markets allocate resources more effectively than governments. Trade is flourishing, gaining great momentum in the ten years since the fall of the Berlin Wall. Total trade with foreigners now accounts for about one-quarter of total US national output - more than twice the share of the period between 1920 and 1970 and the largest trade share for the US economy in more than a century. Not coincidentally, the economy has been expanding at a strong and steady rate. In addition, our economy has benefited from past actions by the government to deregulate industries. The removal of unnecessary government regulation started more than twenty years ago, during the administration of President Ford, and gathered momentum during the Carter years. It has altered the business landscape. Deregulation allowed, indeed forced, businesses to focus more clearly on a marketplace that has become more competitive, with fewer constraints and increased flexibility. If economic policy is to play a constructive role in building a new world economy, policymakers must increasingly focus on policies that eliminate barriers to communication and allow the market to work most efficiently and effectively. They must develop approaches that do not hinder “creative destruction” but appropriately cushion its impact on workers and communities. They can encourage the information revolution by fostering policies and approaches conducive to giving investors and consumers the information they require to make informed decisions. 
For example, the Federal Reserve and the Basel Committee on Banking Supervision have strongly supported initiatives to improve the quality of national and international disclosure practices. Credible financial statements and other disclosures are key means for communicating a company’s operating results and its overall health, as well as for making more transparent various operating activities. Regarding monetary policy, central banks around the world are now endeavoring to provide stability to their domestic price levels. In some cases, this focus on price stability was undertaken in order to return credibility to the central bank after a period of unacceptable inflationary pressures. The Federal Reserve, with our mandate, must also seek to facilitate the transmission of the information that the price level is meant to convey. If the central bank maintains a stable purchasing power for money, workers and firms will more clearly see the values being attached to their opportunities and more effectively make judgments about the allocation of their resources. This is a monetary policy that does not attempt to alter the information being transmitted by the marketplace but to increase its clarity and consistency. The increased openness of Federal Reserve decisions - reflected in announcement policies aimed at more rapid and transparent dissemination of Federal Open Market Committee decisions - also needs to be appreciated as a way to facilitate the communication to and within the marketplace in order to promote the most effective policy possible. Policies for a communications era - a local perspective This perspective on economic policy extends beyond the establishment of the national monetary policy that occupies much of my time. A popular bumper sticker says, “Think Globally, Act Locally.” Good advice. Indeed, this simple maxim describes one of the great strengths of the Federal Reserve System. Although many tend to think of the Federal Reserve as a Washington-centric institution, it is, in fact, a structure of twelve independent regional Reserve Banks, one of which is just a few blocks from here, teamed in harness with the Board of Governors in Washington. Reserve Banks have always had an important role in channeling regional economic information into the deliberations of national economic policy. Today, they take those responsibilities a step further. In closing, let me give a few examples of the local programs that are conceived in this spirit. The latest data from the US Census Bureau indicate that Cleveland and Northeast Ohio lag behind other metropolitan areas in small business growth. Linked by a desire to improve the success rate of small business, the Federal Reserve Bank of Cleveland, the US Small Business Administration, and the Greater Cleveland Growth Association’s Council of Smaller Enterprises in 1997 started the Access to Capital Initiative. The purpose of the initiative was to narrow the gap between the need for and the availability of startup and expansion capital in Northeast Ohio. This collaborative, comprehensive approach to identifying gaps and barriers to capital access and developing a strategic, community-based plan to address those deficiencies has led to the creation of the Access to Capital Network. The network will be an umbrella organization that will link small and midsize businesses to the region’s providers of capital and business assistance, primarily through an easy-to-use, interactive software system available free at its web site. 
The Cleveland Reserve Bank has also worked with community organizations throughout the Fourth Federal Reserve District to develop resources to promote microenterprise. Microenterprises are very small businesses with fewer than five employees. These businesses can potentially grow and make greater contributions to the local economy if they can find the capital to do so and if they have adequate access to the technical advice that is often provided to them by community-based organizations. The Cleveland Fed provides technical expertise to these community-based organizations and helps them establish partnerships with financial institutions and other community stakeholders. These are not policies that hope to provide preferential access to financial markets, and they are not directed at particular enterprises. Instead, they are designed to provide the forums, contacts, and skill sets necessary to form the relationships that facilitate business growth. More generally, the Federal Reserve hopes to promote a better understanding among policymakers, community leaders, and private-sector decisionmakers about the resources that support successful economic development. Conclusion As an economic policymaker, I believe that “Building a World-Class Economy” isn’t at all about trying to manufacture various economic outcomes. Fortunately, most policymakers have come to recognize that their role in building world-class economies is to help develop the infrastructure through which people communicate. We need to provide the public with the tools that allow them to judge value accurately and to see opportunities with the greatest clarity. Economic policy, including monetary policy, has to be an integral part of the communications revolution that is sweeping the world. These are the policies appropriate for our era.
Alan Greenspan: Evolving challenges for bankers and supervisors Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the National Association of Urban Bankers, Urban Financial Services Coalition, San Francisco, (via videoconference), on 25 May 2000. * * * It is a pleasure to discuss with you the evolving challenges for bankers and supervisors posed by financial reform and continuing technological and financial innovation. Today the nation is enjoying the longest-running economic expansion in its history. The expansion has been unprecedented not only in its duration but also in its strength. Clearly the rapid technological innovation of the past decade has played a strong role in the expansion’s endurance by improving labor productivity and opportunities for businesses to efficiently expand their output of goods and services. Economic growth has also been supported by the strength and stability of our banking system, which recently recorded its eighth consecutive year of record earnings. The agility with which our banking system recovered from the severe difficulties of the late 1980s and early 1990s to fulfill a critical intermediary role is a testament to its resilience. However, as our economy continues to evolve rapidly, banking organizations will need to continue to adapt to the changing needs of businesses and consumers. In particular, with the passage of the Gramm-Leach-Bliley Act, our financial services industry will be better able to meet those demands in the decades to come through prudent innovation. As an example of innovation and adaptation, I note that your own organization is widening its scope to invite membership from the entire financial services industry. Broadening your reach to include insurance, brokerage and other financial services not only reflects market realities but should also assist in your endeavors to improve the economic development of underserved communities and promote the professional growth of your members. While innovations and adaptations in our economy have helped create and prolong the prosperity we are now enjoying, they also bring to bear new challenges. Today I would like to provide a brief overview of what I think are the key challenges, risks, and opportunities faced by bankers and their supervisors. To start, I would like to shift your attention from the vibrant economy and exceptional banking conditions to some of the risks that are always present. For example, the lax standards, excesses, and fraud present in a few recent bank failures show that even as most banks post record profits, the deposit insurance fund can still experience disproportionate losses from undisciplined institutions. The high cost of recent failures reinforces lessons of the past for banks and supervisors, including the need for continuous vigilance against causes of bank failure such as fraud, credit concentrations and rapid entry into new and unfamiliar activities. In addition to these risks, institutions face the challenge of fighting both complacency and competitive pressures with regard to their lending practices. There are some recent indications that the industry is aware of these challenges. The most recent Federal Reserve Senior Loan Officer Opinion Survey suggests that banking organizations are becoming more sensitized to risk and are continuing to firm their lending practices. 
In particular, the percentage of domestic banks tightening standards on commercial and industrial (C&I) loans was the largest since the November 1998 survey. In addition, risk premiums on C&I loans have also risen. While firming loan standards and terms is one technique for managing credit risk, sound risk-management systems also involve testing the performance of borrowers under more stressful conditions to reveal weaknesses. When strong conditions largely mask the susceptibility of marginal borrowers, stress testing is invaluable for revealing the magnitude of portfolio risk posed by more challenging economic conditions. Other factors that are important for strong asset quality include maintaining adequate pricing amidst fierce competition from other banks and non-banks alike. Some organizations are responding to these pressures by moving lower on the credit quality spectrum in a reach for higher nominal yields. In a special question added to the latest lending survey, banks indicated that demand for C&I loans has somewhat strengthened as below-investment-grade borrowers have found unfavorable conditions in the high-yield bond market and have turned to banks as an alternative. The majority of banks reporting additional demand from these borrowers also indicated that they were fairly receptive to these customers. While lending to higher-risk borrowers presents opportunities, banks must ensure that their risk-management systems can properly discern the difference between nominal and risk-adjusted yields, identify credit concentrations, and allocate enough capital and reserves to offset the higher risk of these loans. Banks without these safeguards, as well as limits and adequate tracking and reporting to senior management and their boards, have in the past experienced significant problems and sometimes failure. Credit quality is not the only area of supervisory focus. The trend of narrowing interest margins at many banks has been coupled with lengthening asset maturities and declining core deposits. The erosion in bank core deposits is to some extent attributable to competitive pressure from the marked rise in equity values and mutual funds. In addition, as run-off of lower-cost deposits has been replaced with higher-cost funding from capital markets, pressures on interest margins and liquidity have intensified. Moreover, the decline in stable core deposits and the steady rise in average asset maturities have resulted in higher levels of interest rate risk, a trend evident in our surveillance screens. Banking organizations with strong risk management systems will of course carefully evaluate how these asset/liability trends are affecting their performance and risk profile and take mitigating steps as appropriate. As competitive pressures have intensified, large and small banks alike have sought solutions by broadening the variety of products they offer and delivering them through innovative delivery channels such as the Internet. Interestingly, though conventional wisdom would suggest that smaller community banks would be at a disadvantage in the competitive financial services arena, this has not been the case. By forging cooperative alliances with technology, insurance, brokerage and other firms, community banks have kept up with larger organizations in providing their customers with the diversity of financial tools and products they demand without the attendant fixed start-up costs. 
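Returning to loan pricing: the distinction drawn above between nominal and risk-adjusted yields can be made concrete. The sketch below is illustrative only - the loan figures and function are hypothetical, not drawn from the remarks - and simply nets expected credit losses (default probability times loss given default) out of the stated rate.

# Hypothetical illustration of nominal vs. risk-adjusted loan yields.

def risk_adjusted_yield(nominal_yield: float, pd: float, lgd: float) -> float:
    """Stated yield less expected credit loss (probability of default x loss given default)."""
    return nominal_yield - pd * lgd

safer_credit = risk_adjusted_yield(0.070, pd=0.002, lgd=0.40)    # 7.00% - 0.08% = 6.92%
riskier_credit = risk_adjusted_yield(0.095, pd=0.050, lgd=0.40)  # 9.50% - 2.00% = 7.50%

print(f"{safer_credit:.2%}, {riskier_credit:.2%}")
# The 250 basis point nominal pickup shrinks to about 58 basis points after
# expected losses - and vanishes if defaults are stressed to recession levels
# (e.g. pd=0.11 gives 5.10%, below the safer credit).

Stress testing, in this framing, simply re-runs the same calculation under recession-level default and loss assumptions.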
While larger organizations may have some advantage over smaller banks through their brand identity and larger budgets for technology and marketing, community banks are more likely to have a comparative advantage in understanding the diverse needs of their customer base and in their ability to respond quickly with personalized service. More fully recognizing and responding to customer needs is particularly important when viewed in the context of trends in wealth formation in our economy. The Federal Reserve’s most recent Survey of Consumer Finances suggests that although the current economic expansion resulted in broad gains in median household wealth between 1995 and 1998, families with low-to-moderate incomes and minorities do not appear to have fully benefited. For example, median net worth declined over this period for families with incomes below $25,000, and medians for non-whites and Hispanics were little changed. In addition, lower-income families were less likely to own homes, which constitute the bulk of the value of assets for families below the top income quintile. Despite these troubling indications, there were some encouraging signs. In particular, the share of families with incomes below $25,000 fell from 41% in 1995 to 37% in 1998, and more families within that lower-income group reported that they had a checking account. Moreover, home-ownership rates among minorities have risen. For example, according to US Census data, home-ownership rates among blacks rose from 43% in 1995 to 48% in the first quarter of 2000, suggesting that some progress has been made in access to credit for minorities. Although breaking down the barriers that have produced disparities in income and wealth is not simple, promoting equal access to credit for sound borrowers is one step in the right direction. As I have said previously, discrimination is against the interests of business - yet business people too often practice it. To the extent that market participants discriminate, they erect barriers to the free flow of capital and labor to their most profitable employment, and the distribution of output is distorted. In the end, costs are higher, less real output is produced, and national wealth accumulation is slowed. By removing the non-economic distortions that arise as a result of discrimination, we can generate higher returns to both human and physical capital. Banking and other lending organizations that develop expertise to tap, educate, and encourage underserved customers are likely to provide better and more informed access to credit and expand profit opportunities. In that regard, many banks are well equipped to tailor their services to individual customer circumstances in a personalized setting and to educate customers about various products and services that could help them achieve their financial goals. In today’s more complex world, the diversity of financial product choices facing consumers is truly astonishing. Similarly, banks are now also facing much broader choices, especially when one considers the opportunities presented by the passage of the Gramm-Leach-Bliley Act. By modernizing our banking laws and making them more consistent with marketplace realities and the needs of consumers, the financial services industry will be able to grow and innovate with far fewer artificial constraints. How various financial service providers choose to take advantage of the act will be one of the more interesting dynamics as our financial system evolves in the years ahead. 
Clearly, many franchises can succeed by continuing to focus on traditional banking. Organizations that decide to depart from past successful strategies by expanding into new activities should do so only after careful consideration. As of mid-May, 270 domestic banking organizations and 17 foreign banking organizations had filed to become financial holding companies. Of those, roughly three-quarters had less than one billion dollars in assets. I suspect that many of these organizations are not intending to immediately launch into full-scale brokerage, venture capital, or insurance activities, but rather are looking to keep their options open and retain flexibility should opportunities present themselves. If true, that is encouraging, for it is one thing to gain FHC status and begin cautiously experimenting with these new powers, and quite another to take on the risks related to full-scale acquisitions or extremely rapid growth of new businesses. Whichever configuration financial firms choose, translating the traditional and new financial powers into longer-term economic value for customers and shareholders will be the leading challenge in the coming decades. In this changing environment, supervisors will be challenged to adapt and refine their programs to accommodate both innovations in traditional banking and new activities permitted by the act. The supervisory strategy for addressing banking innovations of the past decade has been a risk-focused approach that moves well beyond earlier, one-size-fits-all examinations to achieve what we believe is a more effective and less burdensome process. The risk-focused approach has also been tailored to distinguish between larger, more complex banking organizations, on the one hand, and the more traditional regional and community organizations on the other. The large, complex companies are receiving a more continuous level of oversight given the rapidly shifting risk profiles that can result from their operations, while well-capitalized and well-managed regional and community organizations receive a greater degree of off-site monitoring and less frequent on-site visitations, consistent with statutory mandates. I should emphasize that these programs are not meant to be implemented in a rigid fashion; institutions that have characteristics that fall somewhere in the middle of the two programs would be flexibly accommodated through adjustments to our supervisory plan for the institution. That kind of flexibility will be essential with the emergence of smaller financial holding companies and the expanded range of permissible activities. Clearly, these two broad supervisory programs will need further customization or segmentation to respond to the evolving diversity of our supervisory caseload. Another area where supervisors are attempting to more appropriately align risks with regulatory approaches is capital. You may be aware that the Basel Committee on Banking Supervision is working to refine and improve the risk-based capital measure to make it more sensitive to the underlying risks of various banking activities. As you might surmise, those efforts are likely to result in a revised framework that is geared toward the kinds of exposures and risk-management systems typical of internationally active institutions. For these organizations, the benefits from more precisely aligning risks with capital charges, we trust, will outweigh the substantial costs of added complexity. 
However, for community banks with traditional exposures, the costs of such a complicated framework will likely outweigh any benefits, and hence it will be impractical to implement. This has led to the consideration of a dual or bifurcated approach to capital that parallels our approach to supervision. Such an approach would recognize the potential tension between the complexity and cost of the next version of the international capital standards and the more limited needs of smaller, more traditional banks. Implementing a second, more streamlined capital adequacy standard for qualifying domestic institutions would seem to have merit. Discussions on such an approach are only preliminary, but the arguments for continuing to more fully calibrate our supervisory and regulatory approaches to the nature and risk profiles of the institutions we supervise are compelling. Capital standards and the supervisory process are two of the three key tools regulators use to get their job done. The third tool, disclosure, holds promise for yielding benefits to our financial system both domestically and globally. In past decades, the business of banking was fairly opaque but straightforward, with banking risks largely embedded in the credit judgments inherent in the loan portfolio. Today, with the explosion in financial innovation that has created various derivative, securitization, insurance, and other structured products and the greater diversity in activities permitted by the Gramm-Leach-Bliley Act, not only are risks more opaque, but even when revealed, sometimes difficult to interpret. Fortunately, the same technology and financial techniques that have created this added complexity can also be harnessed to produce more meaningful disclosures that allow markets to analyze risks and exert discipline on those that would take on imprudent levels of exposure. While not a panacea, improved disclosure can complement the supervisory process and regulatory capital, and obviate more intrusive investigations, holding out the promise of less supervisory intervention. Recently, the Federal Reserve, in collaboration with the Securities and Exchange Commission and the Office of the Comptroller of the Currency, established a private-sector working group to review industry best practices and develop options for improving the public disclosure of financial information by large, complex banking and securities organizations. Enhanced disclosures are also being pursued through regulatory reporting. A proposal to be released shortly will eliminate less meaningful items on the Call Report and request additional information on activities of growing significance for some institutions, including loan servicing, securitizations, venture capital, and insurance. More relevant regulatory disclosures should not only improve transparency but should also help our off-site monitoring and tailoring of our supervisory program. In closing then, we live in a fascinating period in American history, in which rapid change will force business and government to continuously reevaluate previously held assumptions and adapt to change. There is no static, optimum model either for financial service providers or for financial regulators, as both are engaged in a continuously evolving process. 
I am optimistic that recent financial reforms and continuing innovations will translate into more useful financial products and services for a broader spectrum of consumers and businesses and, in turn, into fuller participation by all segments of our society in the kind of economic progress we have experienced to date.
Laurence H Meyer: The challenges of global financial institution supervision Speech by Mr Laurence H Meyer, Member of the Board of Governors of the US Federal Reserve System, at the Federal Financial Institutions Examination Council, International Banking Conference, held in Arlington, Virginia, on 31 May 2000. * * * Most of my long-time academic and business friends and acquaintances believe that Federal Reserve governors spend virtually all their time in monk-like contemplation of economic projections and monetary policy. Well, reality is certainly different. Most of my time is occupied by issues concerning institutions and markets, regulations and supervisory policy. In the process, I have learned that the most difficult and most under-appreciated job is yours - bank supervision. When times are good, bankers and policymakers don’t see the need for your services. In not-so-good times, they blame you for not seeing problems soon enough. Adding to the supervisor’s problems is the increasing scale, scope, span of operation, and general complexity of the largest banks operating in the United States - the “global financial institutions” of my title, or, as we call them at the Fed, large, complex banking organizations (LCBOs). These entities are becoming increasingly difficult to supervise and evaluate because of their complexity and opaqueness. The banking agencies have recognized this difficulty and each has developed more-or-less special programs and approaches for the organizations it supervises. Let me underline that these observations are not intended to suggest that regional and community banks are unimportant. Rather, they are intended to convey that the modifications - recent and future - required in the supervision of those smaller banks are far fewer than those required for the LCBOs. The capital reforms being developed at Basel, for example, are really addressing developments at complex organizations, and the extent of changes at most other commercial banks in this country will be, I think, quite modest. In the balance of my remarks today, I would like to discuss what I think are the major approaches that we should take in addressing the challenges of supervising the increasingly complex and large global financial institutions. Internal ratings For the past decade or so supervisors have recognized that snapshots of the balance sheets of complex banking organizations are not very helpful for supervisory evaluations. Positions just change too rapidly. Moreover, the complexity of positions implies a major commitment of time and supervisory resources. Thus, all the banking agencies have adopted, in one form or another, an approach that emphasizes careful analysis and evaluation of each bank’s internal risk management policies and procedures, as well as transactions-testing of those policies and procedures. I suspect that a new, and I think evolutionary, supervisory vehicle - one that supplements the evaluation of risk-management systems - will soon be a required part of supervision for all of us. I refer, of course, to the development, use, and application of internal credit-risk-rating systems by banks. Systems for credit-risk rating, in one form or another, are widely used by LCBOs for internal management purposes. As they improve, these systems can increasingly be expected to figure prominently in our supervisory process. 
That dual use - for both management and supervision - is a dramatic innovation, creating a link between bank management and supervisory standards that has been needed for some time. Cutting-edge banks have already begun to classify their loan portfolios into risk classes of finer and finer gradation and to use those classifications for internal capital allocations, for loan pricing, and for the determination of loan loss reserves, among other purposes. When the classification scheme is used for internal capital allocations, a probability of default, as well as a loss rate from default, is calculated for each loan. Regulatory agencies and central banks around the world are working on ways to use this same information as the raw material for the development of a much more accurate regulatory capital requirement. The purpose of capital, I need not remind you, is to absorb unexpected losses. At least in principle, a bank’s quantification of probabilities of default, and of loss rates given default, in combination with other information, allows both management and policymakers to determine how much capital is needed to cover unexpected losses within a certain minimum probability. Indeed, I believe that a consensus is developing among G10 countries around just such a use - that is to say, a capital accord in which the capital requirements for individual banks will vary with their individual credit risk profiles, based increasingly on the bank’s own internal risk evaluations. To be sure, there remains the problem of supervisory validation of these internal risk systems to ensure, first, that the risk classifications are objective and reliable and, second, that they are also used by management for decisionmaking. No less critical is the tying of risk weights to internal risk classifications in such a way as to minimize inconsistencies of capital treatment among banks that have similar risks. From the work I’ve seen, these problems look solvable, at least in stages. Getting the numbers right is both a science and an art - and is critical. If we simply create a few more risk weights and buckets, we will, I submit, have done no more than create new opportunities for capital arbitrage. In short, we will simply continue to induce banks to retain their risky assets when their own internal capital allocations exceed the regulatory levels and to sell, securitize, and otherwise shift off-balance-sheet those assets for which the regulatory capital requirement exceeds the economic requirement. The net result is likely to be riskier and less transparent banks - quite the opposite of what policymakers, supervisors, legislators and the public want. Regardless of what we do, and I cannot emphasize this enough, those banks on the frontier of risk management, small in number now but increasing, will continue along their current path of ever more sophisticated use of internal risk classifications. And whenever regulatory capital differs from economic capital by more than the cost of arbitrage, they will arbitrage. Another way of saying this is that regardless of our actions, frontier banks will always attempt to manage their businesses to earn competitive risk-adjusted rates of return on equity. Today, our capital regulation, with its one-size-fits-all risk weight for loans, encourages banks to withdraw from low-risk credit markets, or to arbitrage, when regulatory capital requirements exceed levels consistent with an activity’s underlying economic risk. 
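To illustrate how probabilities of default and loss rates translate into capital "needed to cover unexpected losses within a certain minimum probability," here is a minimal simulation sketch. The portfolio parameters and the 99.9% confidence level are assumptions chosen for illustration, and the sketch treats defaults as independent, ignoring the default correlation that real internal systems must model.

# Minimal sketch: economic capital as a high quantile of a simulated loss distribution.
import random

N_LOANS, EXPOSURE, PD, LGD = 1000, 1.0, 0.02, 0.45  # hypothetical homogeneous portfolio
N_SIMS, CONFIDENCE = 5000, 0.999

def portfolio_loss() -> float:
    """One draw of total credit losses, treating defaults as independent."""
    defaults = sum(1 for _ in range(N_LOANS) if random.random() < PD)
    return defaults * EXPOSURE * LGD

losses = sorted(portfolio_loss() for _ in range(N_SIMS))
expected_loss = sum(losses) / N_SIMS              # covered by pricing and loan loss reserves
quantile_loss = losses[int(CONFIDENCE * N_SIMS)]  # 99.9th percentile of simulated losses
economic_capital = quantile_loss - expected_loss  # buffer for unexpected losses

print(f"expected loss: {expected_loss:.1f}  "
      f"99.9% quantile: {quantile_loss:.1f}  "
      f"economic capital: {economic_capital:.1f}")

When a flat regulatory charge diverges from the economic capital such a calculation implies, the arbitrage incentive just described takes over.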
Not only is this situation costly and inefficient for banks and their customers, but it also has become increasingly difficult for supervisors to assess the residual capital adequacy of LCBOs, as relatively low risk assets have been removed from the banking book. That is why we need a new regulatory capital framework, and why it is so critical that both the bank and the supervisor use capital weights that are as risk-sensitive as possible. Supervisors, of course, cannot simply take whatever banks are using in their internal risk classifications. Indeed, some large banks are, surprisingly, behind the curve in developing their own internal risk classifications. Their systems have too few categories, are based on insufficient historical data, have been subject to inadequate stress-testing, and are too simplistic. In mid-1999, the Federal Reserve told these banking organizations that they should catch up, and we required our examiners to explicitly evaluate these catch-up efforts in their examinations. I trust that these lagging banking organizations, in their own self-interest, will promptly revise their systems, both to meet coming revisions in the regulatory capital system and to avoid the market’s criticism as information about more institutions’ systems becomes better known to creditors of banking organizations. Market discipline Indeed, harnessing the market to assist in the process is critical to supervising global financial institutions. Reality requires that we emphasize that even with improvement in risk classifications and more accurate capital requirements, we have limited public-policy choices for large and complex organizations. Choice 1: we can accept systemic risk as a cost of having large, global organizations in the marketplace. Choice 2: in order to limit systemic risk, we can adopt very detailed regulation and supervision programs that include a growing list of prohibitions. Choice 3: we can rely more on market discipline to supplement capital reforms and can maintain a level of supervision similar to the one we have today. Given the choices, we simply must try market discipline - and its necessary prerequisite, public disclosure. Large, complex banking organizations already rely heavily on funding from sources other than insured depositors. These other creditors - including, but certainly not limited to, holders of subordinated debentures - should anticipate that the failure of the organization would, in a financial restructuring by the authorities, entail losses - at a minimum, significant haircuts. Fear of loss, if linked with the availability of sufficient information so that creditors are able to determine a bank’s real risk profile, should in turn induce uninsured creditors to behave like those of any nonbanking business. That is, uninsured creditors could be expected to command risk premiums linked to the portfolio risks and other risks of the organization. Such risk premiums should, in turn, act both as curbs on the risk-taking behavior of banking organizations and as supplementary signals to supervisors. But, if either effect is to materialize, the uninsured creditors must have both a credible fear of loss and the information about the individual institution necessary to make judgments and decisions. As most of you are aware, late last month the Fed, in cooperation with the Office of the Comptroller of the Currency and the Securities and Exchange Commission, set up a private-sector advisory group. 
Composed of senior executives of banking and securities firms, the advisory group is to review the state of the art in public disclosure, to counsel us on best practices, and to suggest improvements in those practices. The group’s report will be public. While I have no idea what will be in the report, it is my hope and expectation that we will learn more about how to use market discipline both to strengthen our banking system and to avoid additional regulation and supervision of global financial institutions. At the Federal Reserve, we plan, however, to require that at least the large, complex banking organizations establish and implement a disclosure policy to provide stakeholders with information that can be used to evaluate the organization’s risk profile. Our examiners, as part of both the holding company inspection and the state member bank examination, will review and evaluate such disclosures for their conformance to best practices and their contribution to stakeholders’ understanding of their risk at that organization. We should all be aware that additional public disclosure is not a free good, especially if it works. Banks will find that additional market discipline constrains their options, and supervisors will be concerned about creditors’ response to bad news. But both constrained options and swift market punishment are part of the desired effect of market discipline. Supervision If, and I underline “if”, (1) banking organizations develop working, verifiable, and reasonably accurate methods of evaluating and categorizing credit risks, (2) capital requirements are linked tightly to those risks, (3) public disclosures induce realistic market discipline, and (4) market, operational, and legal risks are under control, then the direction for supervisors seems reasonably clear: to validate systems, policies, and procedures. Now, I have purposely set up a straw man so that we can all appreciate how much work remains to be done. Despite the many tasks that lie ahead, the path that I believe we are on will, I think, lead to supervisory efforts that focus on a bank’s management information and risk-management systems and on providing management with evaluations and criticisms designed to improve those systems. To be sure, transactions-testing will remain an important effort. But, critically, the safety and soundness of the bank will depend on how well its risk-management systems work, the judgment its management brings to bear in using those systems, and the effectiveness of market discipline. It will, I think, increasingly be the job of the supervisor of global financial organizations to evaluate and test systems and to evaluate and criticize the accuracy and helpfulness of the information banks disclose about their own risk profiles. Internal systems and public disclosure, in short, are the real first line of defense. The only alternative for the large and complex banking organization, as I have noted, is intrusiveness and detailed regulation, which would dramatically reduce flexibility and innovation in our banking system. We have, I believe, already started down the path I have described. Several problems remain to be solved. Any one of them could slow or even stop our progress. I have already mentioned the need to develop procedures for validating risk classifications and for converting risk classifications into risk weights on an equitable basis across banks. The challenge of reaching a consensus at Basel is another obstacle. 
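The risk-premium mechanism invoked in the market-discipline discussion above can also be written down. As a textbook illustration (not language from the speech), a risk-neutral holder of a one-year subordinated debenture requires a yield r such that the expected payoff matches the risk-free return r_f:

(1 + r)\bigl[(1 - PD) + PD\,(1 - LGD)\bigr] = 1 + r_f
\quad\Longrightarrow\quad
r - r_f = \frac{(1 + r_f)\,PD \cdot LGD}{1 - PD \cdot LGD} \;\approx\; PD \cdot LGD,

so the observable spread moves roughly one-for-one with the product of the bank's default probability and loss given default - exactly the supplementary supervisory signal described above, provided disclosure is rich enough for creditors to estimate those quantities.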
But whatever we develop, either here or on an international basis, we will be relying on the good judgment and sophistication of the nation’s examiners and on their development of the skills needed to keep pace with the activities of banks operating in the United States. Cooperation But skill and good judgment are, unfortunately, not sufficient. For better or worse, supervisors and policymakers function in a multi-agency environment. We must cooperate across agencies if we are going to get the job done. Most global US banks are supervised by the Office of the Comptroller of the Currency. All bank and financial holding companies and some large banks are supervised by the Federal Reserve. Nothing in this structure was changed by the recently enacted Gramm-Leach-Bliley Act. The challenges of supervising large, complex banking organizations raise yet again the question of how to make the supervisory structure mandated by the Congress work efficiently and in the public interest. All the parties, it seems to me, must work out relationships and operating norms that serve the objective of safe, sound, and efficient financial markets. The implicit tensions among the regulators are a fact of life; goodwill and cooperation are required if we are to carry out the law. Before I proceed further, it might help if I spend a moment on the philosophy underlying umbrella supervision and distinguish this supervisory approach from direct supervision of insured depository institutions. As you know, all large and sophisticated financial services companies manage their risks on a consolidated basis, requiring, in turn, oversight of risk-taking by the consolidated entity. The consolidated, or umbrella, supervisor aims to keep the relevant regulators informed about overall risk-taking and to identify and evaluate the myriad risks that extend throughout such diversified bank and financial holding companies in order to judge how the parts and the whole affect, or may affect, affiliated banks. To fulfill its responsibility, reaffirmed by the recent legislation, the Federal Reserve plans to focus on the organization’s consolidated risk-management process and on overall capital adequacy. For the new financial holding companies, the consolidated capital issue is complicated by the affiliation of banks with institutions that have their own financial regulator and capital regulation. We are in the process of tackling these issues, knowing that responsibility for ensuring adequate management processes and control rests, in the first instance, with a bank’s management and its primary supervisor. As umbrella supervisor, the Federal Reserve seeks to gain an overview of the organization’s activities and to detect potential threats to affiliated US depository institutions. The role of a financial and bank holding company supervisor is significantly different from that of a bank supervisor. The difference reflects the difference between an insured depository institution and a nonbank affiliate of the holding company. Depository institutions are covered by the federal safety net - deposit insurance and access to the discount window and to other guarantees associated with the Federal Reserve’s payment and settlement system. Access to the federal safety net dampens the incentive of investors and creditors to monitor banks’ risk-taking, which in turn breaks the link between bank risk-taking and funding costs. 
Bank regulation and supervision aims to compensate for the resultant breakdown in market discipline and to limit bank failures that could overwhelm the deposit insurance fund. The financial modernization law did not change the focus of the safety net. But the relative growth of activities in bank holding companies outside the insured depository institution, as well as the increased focus by both management and supervisors on consolidated risk management, may make maintaining the distinction between the insured bank and its increasingly nonbank affiliates more challenging. If we let public perceptions, let alone supervisory actions, blur the distinction, we will surely extend the implicit safety net and expand its moral hazard, to the detriment of efficient markets and, ultimately, at high cost to taxpayers. The recently enacted law provides that, when specialized functional regulators already oversee the new permissible activities, duplication of supervision, and hence excessive regulatory burden, should be avoided. In addition, because market discipline operates more effectively in connection with nonbank activities not subject to the moral hazard of the safety net, regulators should try to avoid diminishing market discipline in the new financial holding companies. Thus, the act discourages the extension of bank-like regulation and supervision to nonbank affiliates and subsidiaries. The Federal Reserve can contribute to this goal by being clear in word and deed that the affiliation of nonbank entities with a bank does not afford them access to the safety net. However, the Congress also saw the need for an umbrella supervisor to protect insured depository institutions from the risks of activities conducted by bank holding company affiliates. The law limits the extension of credit by insured depository institutions to their affiliates, and the umbrella supervisor - the Fed - is charged with limiting other forms of risk exposure to the depository institutions from the bank holding company structure. Clearly, there is a tension between protecting banks from such risks and avoiding the extension of bank-like supervision to affiliates. The provisions of the law dealing with the relationship between the Federal Reserve and the functional supervisors of certain types of nonbank affiliates - the SEC, the Commodity Futures Trading Commission, and the state insurance regulators - attempt to balance these considerations. As you know, these provisions call for the Federal Reserve to rely, as much as possible, on the examinations conducted by the functional supervisors and on public reports to obtain information about broker-dealers, insurance companies, and futures commission merchants. The Federal Reserve may examine such functionally regulated entities only if (1) the Board has reasonable cause to believe that the entity is engaged in activities that pose a material risk to an affiliated depository institution, (2) the Board determines that an examination is necessary to inform the Board of the entity’s risk-management systems, or (3) the Board has reasonable cause to believe that the entity is not in compliance with the banking laws. We are in the process of working out satisfactory procedures with functional regulators. But it seems to me that we also must work harder to cooperate and share information among the umbrella and bank supervisors in a manner that is satisfactory to both and that minimizes regulatory burden and overlap. 
In principle, the relationship between the umbrella supervisor and the primary federal bank regulator could involve the relationship between the Federal Reserve and either the Federal Deposit Insurance Corporation or the OCC. In practice, however, the key relationship for large, complex financial holding companies will be between the Federal Reserve and the OCC because the banks in large, complex financial holding companies are either state member banks or national banks. Indeed, most of the large and complex institutions likely to take advantage of the new opportunities have lead banks, as I have noted, with national charters. This relationship between the primary bank regulator and the umbrella supervisor must respect the agencies’ individual statutory authorities and responsibilities. At the same time, the primary bank regulator and the umbrella supervisor need to share information that allows them to carry out their responsibilities without creating duplication or excessive burden. Given the systemic risk associated with the disruption of the operations of large banks - and the role of the bank within the broader banking organization - the Federal Reserve believes that it needs to know more about the activities within large insured depository institutions than can be derived from access to public information or from the reports of the primary bank supervisor. Similarly, the primary bank regulator needs information about the activities of a bank’s parent company and its nonbank affiliates in order to protect the bank from threats that might arise elsewhere in the consolidated organization. The need is particularly pressing when companies manage their businesses and attendant risks across legal entities within the structure of a financial holding company. As I have noted, the result is a complicated relationship, one with unavoidable, inherent tensions. We each have our specific statutory responsibilities - the primary bank regulator for the bank and the Fed for the consolidated holding company. Yet to be most effective we need to work cooperatively and to keep each other informed. This cooperation should, when necessary, include participation in each other’s examination teams. The bottom line is that the primary bank regulator and the Federal Reserve as umbrella supervisor should establish practical operating arrangements to ensure that the relationship avoids duplication, minimizes regulatory burden, respects individual responsibilities, and still ensures the wider flow of information required to meet their individual and collective responsibilities. There are, I am pleased to report, ongoing discussions between the Federal Reserve and the OCC focused on improving our cooperation and coordination where we are both involved in the supervision of individual LCBOs. In many cases today, the existing relationships and coordination between Federal Reserve and OCC examiners are already excellent. Working collaboratively, we will assess our coordination at several LCBOs to draw lessons from those cases where the relationship is already working very well. We will use this information to improve the consistency of our relationship across all the LCBOs where we are both involved in supervision. In addition to improved cooperation among US banking agencies, supervision of global financial institutions requires strong relations among supervisors worldwide. Most certainly, we have sought to build such relations, for decades now, through the Basel Committee on Banking Supervision and its predecessor.
Although the work of that committee has focused on banks in G10 countries, the supervisory principles and sound banking practices that it has identified have helped to strengthen bank supervision around the globe. Development of the “Core Principles for Effective Banking Supervision” is a prime example of those efforts. The creation of the Financial Stability Institute, under the joint sponsorship of the Bank for International Settlements and the Basel Committee on Banking Supervision, is another and represents an important effort to help developing countries train their supervisory staff. The need for international cooperation extends beyond banking systems and bank supervisors, however, and must embrace the full range of regulated activities that large, complex financial institutions conduct. Toward that end, authorities from around the world have established the Joint Forum, made up of representatives of agencies regulating banking, insurance, and securities activities. The Financial Stability Forum, established by the G7 in 1999, is another effort to promote international financial stability through information exchange and cooperation in financial supervision. The Forum regularly brings together national authorities responsible for financial stability in significant international financial centers - including both securities and banking supervisors - international financial institutions, and representatives of international groups of supervisors and regulators. Other groups exist, and others will, necessarily, be created to address issues of specific interest; they may have short or long lives. The point is, it is important that we communicate and coordinate our activities, so that we understand each other’s responsibilities and oversight techniques. As international problems emerge - as they will - knowing our counterparts abroad and trusting their judgment could be essential to resolving problems in a timely and orderly way. Conclusion The challenge of supervising global financial institutions is the challenge of the decade for supervisors. Large banking organizations are likely to become increasingly complicated and wide-ranging, and the banking supervisory agencies will have to adjust to that. In my view, the adjustment will require increasing reliance on banks’ own internal risk management, and especially on internal risk classification systems; on regulatory capital linked to internal risk classifications; on supervision that focuses on evaluation of, and supervisory feedback on, risk-management systems; on market discipline; and on increased cooperation among agencies. None of these steps will be easy. The good news is that we’ve started on all of these efforts, and that progress has already been made.
|
board of governors of the federal reserve system
| 2000 | 6 |
Speech by Mr Laurence H Meyer, Governor of the Board of Governors of the US Federal Reserve System, before the Boston Economics Club, Boston, on 6 June 2000.
|
Laurence H Meyer: The New Economy meets demand Speech by Mr Laurence H Meyer, Governor of the Board of Governors of the US Federal Reserve System, before the Boston Economics Club, Boston, on 6 June 2000. * * * I often draw the themes for my talks from the questions I hear about the intersection of the economic outlook and monetary policy. This evening, I begin with two questions that are central to the economy’s prospects and the challenges facing monetary policy. First, is there a new economy? And second, what role, if any, do traditional economic principles, specifically the role of supply and demand, continue to play in today’s economy? Before I proceed to those questions, I want to emphasize why the answers matter. It almost - and I say, almost - goes without saying. Nevertheless, I can’t stress too often that we care about the balance of supply and demand in the economy because we care about promoting both full employment and price stability and, thereby, maximum sustainable growth. We want to contain inflation because doing so has been crucial for sustaining the economic expansion that we now enjoy and for providing an environment conducive to private decision-making and longer-term planning so critical for taking advantage of new technological opportunities. Containing inflation has, I am sure, contributed to the length and strength of the current expansion - an expansion, by the way, that is the longest in our nation’s history. And I need not remind you that the low inflation we now have was dearly purchased in the late 1970s and early 1980s with the highest interest rates since the Civil War and the highest unemployment rate since the Depression. Precisely because inflation is the critical issue that hangs in the balance of new economy possibilities and old economy regularities, I will offer some observations on how I read the recent data on labor compensation and price inflation. My comments are in the spirit of inflation reports that many central banks with explicit inflation targets regularly issue. Before proceeding, let me remind you that the views expressed on the outlook and on monetary policy are my own. I am not speaking on behalf of the Board of Governors or the Federal Open Market Committee. Is there a New Economy? So, is there a “new economy”? The answer is: it depends. It depends on how you define new economy, and it depends on where you live. There are broader and narrower definitions of the new economy. The narrow version defines the new economy in terms of two principal developments: first, an increase in the economy’s maximum sustainable growth rate and, second, the spread and increasing importance of information and communications technology. The latter is presumably the major contributor to the acceleration in labor productivity that, in turn, is the principal source of the increase in trend growth in real GDP. A third, and perhaps related, development is a possible increase in the economy’s sustainable utilization rates, specifically a decline in the non-accelerating-inflation rate of unemployment (NAIRU). Our laboratory for the new economy is the United States, given that there is very little evidence outside the United States for even this narrow definition of the new economy. In the case of the United States, however, there is little doubt that the underlying rate of productivity growth has increased significantly in the second half of the 1990s. From 1974 to 1995, labor productivity advanced at an annual rate of about 1½%.
Productivity then accelerated to a rate of about 2½% in the second half of the 1990s. This acceleration appears to have been spread out over the second half of the 1990s, so that the average rate over that period understates the rate of productivity growth at the end of the period. Productivity typically grows faster than its longer-term trend when GDP growth is rising and falls below trend when GDP decelerates. This pattern simply reflects lags in adjusting employment to changes in GDP growth. Measuring productivity growth over a long period, such as 1974 to 1995, effectively eliminates this shorter-run component of productivity growth. And because GDP growth was relatively stable during the second half of the 1990s, shorter-run dynamics appear not to have been an important contributor to the higher productivity growth in that period. Moreover, careful econometric attempts to isolate the short-run dynamic and longer-run structural components generally have concluded that structural productivity growth increased from about 1½% in the earlier periods to around 2½% to 3% by the end of the decade. That would put the sustainable rate of GDP growth at 3½% to 4%. Still, there is considerable uncertainty about trend productivity growth, including whether it might be accelerating, especially given the brief period over which higher and rising structural productivity growth has been experienced. Important questions about the measurement of productivity aggravate this uncertainty. And there is also considerable uncertainty about how long the higher productivity growth will persist. For example, periods of more rapid productivity growth might be best understood as a transition to a higher level of productivity that is based on major technological developments. The persistence question is more important for assessing longer-run fiscal prospects - including the solvency of Social Security - than for monetary policy decisions that are made in the context of a one- to two-year period. Using the neoclassical model, and disaggregating capital into information and communications technology and other capital, Dan Sichel and Steve Oliner of the Board staff decomposed productivity growth into contributions from capital deepening (the growth arising from an increase in the ratio of capital to labor) and multifactor productivity growth (the growth in output that cannot be accounted for by increases in labor and capital inputs) and into the contributions from the use of information technology and from increased efficiency in the production of computers. According to their estimates, a bit less than half of the productivity acceleration was due to a pickup in capital deepening and a bit more than half to an increase in multifactor productivity growth. More than 90% of capital deepening came from information and communications technology equipment, and nearly 40% of the increase in multifactor productivity growth came from increased efficiency in the production of computers and embedded semiconductors. Altogether, therefore, information and communications technology accounted for slightly more than two-thirds of the increase in productivity.1 Besides the direct effects of information and communications technology through capital deepening and the more efficient production of computers, this technology may also indirectly raise productivity through spillover effects.
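Before turning to those spillover effects, it may help to see how the shares just quoted combine. The sketch below, in Python, uses round numbers of my own choosing to illustrate the arithmetic; they are not the Oliner-Sichel estimates themselves.

```python
# Illustrative growth-accounting arithmetic with round numbers. A one
# percentage point productivity acceleration is split per the shares
# quoted in the text; none of these figures is an official estimate.

acceleration = 1.0                       # assumed pp pickup in labor productivity growth

capital_deepening = acceleration * 0.45  # "a bit less than half"
mfp = acceleration * 0.55                # "a bit more than half"

it_capital_deepening = capital_deepening * 0.90  # "more than 90%" of deepening from IT
it_mfp = mfp * 0.40                              # "nearly 40%" of MFP from computer production

it_share = (it_capital_deepening + it_mfp) / acceleration
print(f"IT share of the acceleration: {it_share:.0%}")
# Prints about 62% with these round inputs - in the neighborhood of the
# "slightly more than two-thirds" figure from the precise estimates.
```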
If the use of information and communications technology generates externalities throughout the economy - for example, through new efficiencies from e-commerce - the overall efficiency in production will increase. In a traditional growth accounting setup, these effects would show up in multifactor productivity growth. Evidence of spillovers is extremely sparse. Some back-of-the-envelope calculations by Oliner and Sichel suggest that such effects have been quite small to date, though the explosive growth of e-commerce, particularly in the business-to-business segment, suggests a potential for a more important contribution over time. But do these developments - specifically higher trend productivity growth and the spread of information and communications technology - alone justify the “new economy” label? We could, for example, explain recent US economic performance in terms of “new parameters in the old paradigm”. Specifically, we could increase the estimate of trend productivity growth, based on higher multifactor productivity and capital deepening - both due in large part to information and communications technology - and have a fairly good explanation of the remarkable performance of the US economy. This approach would explain the recent productivity performance without denying the continued relevance of old economy regularities, including the role of supply and demand imbalances as a source of inflation dynamics. The alternative - and the broader interpretation that often seems to underlie the new economy label - is that we are witnessing a more fundamental change in the paradigm. The old rules no longer apply. Throw out the NAIRU. Heck, throw out supply and demand. No limits, no business cycles. All right, this is a bit of an exaggeration, but you get the point that I am not especially partial to the broader interpretation of the new economy concept! Still, to be fair, there are other potential and perhaps more far-reaching implications of the spread of information and communications technology, including the role of the Internet and e-commerce. Today, these are, in my view, best expressed as questions about future prospects rather than as principles underlying the present economy. For example, do these developments increase the competitiveness of markets, and, if so, how does this affect inflation dynamics? They appear to increase the speed and effectiveness of price discovery. What does this imply for pricing leverage and inflation dynamics? Do they contribute to a permanent increase in sustainable utilization rates, perhaps by increasing the efficiency of the matching of available workers with available jobs?
1 For papers on recent productivity performance and attempts to separate cyclical and trend components and the role of capital deepening and multifactor productivity, and to measure the contribution from information and communications technology, see Robert J Gordon, “Does the New Economy Measure Up to the Great Inventions of the Past,” Journal of Economic Perspectives (forthcoming); Dale W Jorgenson and Kevin J Stiroh, “Raising the Speed Limit: US Economic Growth in the Information Age,” 1 May 2000; Macroeconomic Advisers, “Productivity and Potential GDP in the ‘New’ US Economy,” September 1999; and Stephen D Oliner and Daniel E Sichel, “The Resurgence of Growth in the Late 1990s: Is Information Technology the Story?” working paper, Federal Reserve Board, February 2000.
Do they result in rapidly growing sectors dominated by increasing returns to scale, where increases in demand lower cost and hence prices? These are all provocative and important questions, but none of these developments, in my view, is powerful enough at this moment to support the notion that labor and other utilization rates can rise ever higher without triggering accelerating prices - the broader version of the “new economy”. So, is there a new economy? As I said, it depends. For my part, I accept the proposition that there has been a significant improvement in underlying productivity growth in the United States, that it is very closely tied to improvements in information and communications technology, and that it is likely to spread around the world. But I resist the new economy label because it seems to encourage a disrespect for the old rules that could seriously undermine our success in taking advantage of the new opportunities. This brings me to my second topic. Welcome back supply and demand I was startled by the bold title of an article that appeared in The Wall Street Journal on 31 December 1999: “So Long, Supply and Demand”. But it illustrates the unbounded optimism - some might even call it irrational exuberance - about economic prospects and a willingness to abandon time-tested economic principles that offer cautions and imply constraints on economic opportunities. I was rather certain that confidence in supply and demand would make a comeback, and so I was delighted to see the front-page story in The Wall Street Journal on 16 May 2000 - the day of the last FOMC meeting. The title this time was “Firms Start Raising Prices, Stirring Fears of Inflation Fighters,” and it began: “Even in the new economy, at least one old rule still applies: if demand exceeds supply for long enough, sellers will raise prices.” So let me count the ways that supply and demand help us to understand the recent experience and the challenges facing monetary policy today. First, a productivity shock affects aggregate demand as well as potential supply and may initially have an even larger effect on demand than on supply. In early discussions about the productivity shock, the emphasis was, not surprisingly, exclusively on its supply-side implications - specifically a faster rate of productivity growth and hence of sustainable GDP growth. The natural corollary seemed to be that a faster growth of supply than of demand would be a powerful disinflationary force. But during the period over which productivity has accelerated, demand has grown faster than potential supply. The demand effects - to the extent that they are directly related to the productivity shock - likely reflect the more favorable investment opportunities, the effect of expected profitability on equity prices and hence household wealth and consumption, and the effect of the increase in expected future labor income on current consumption. Demand, it appears, received an additional boost over this period from a run-up in equity prices that the higher productivity growth alone could not fully account for. The balance between supply and demand can be inferred from movements in utilization rates, specifically in the unemployment rate. When actual output is expanding at the same pace as potential, the unemployment rate will be stable. When output growth outpaces the growth of potential, the unemployment rate declines. And the unemployment rate has declined almost 0.4 percentage points a year for the past four years.
This translates into excess demand growth of 0.75 to 1 percentage point relative to potential supply growth. The second insight - and enduring old economy wisdom - is that a proximate source of changes in inflation is an imbalance between the levels of aggregate supply and aggregate demand. This can be expressed as an imbalance between actual and potential output or as a divergence of the unemployment rate from the NAIRU. The imbalance between the growth rates of aggregate supply and demand is, of course, the source of changes in the balance between the levels of aggregate demand and supply. But inflation is related directly to the levels, not to the growth imbalance. And, even in the new economy, excess aggregate demand ultimately drives up inflation. Thus, the limits may have changed, but the consequences of overtaxing the limits remain the same. Do we have excess aggregate demand? In my judgment, we have excess demand conditions in the labor market. The central tendency for my estimate of the NAIRU is in the range of 5% to 5¼%, compared with the current unemployment rate of 4.1%. This estimate is consistent with most large-scale macroeconometric models and with the estimates of the NAIRU that underlie the economic and budget projections of both the Council of Economic Advisers and the Congressional Budget Office, but there is nonetheless legitimate uncertainty about the estimate of the NAIRU. This uncertainty has been, in my view, an important consideration in the way monetary policy has responded to recent economic developments. Obviously, whether the NAIRU is closer to 4% or to 5% affects the difficulty associated with rebalancing supply and demand to contain the risk of higher inflation. But why has inflation remained moderate if there is persistent excess demand in the labor market? This is still another supply and demand story. The economy is subject to two fundamental types of aggregate economic shocks: supply shocks and demand shocks. These two types of shocks give rise to different challenges for monetary policy. Supply shocks come in two varieties: relative price shocks (such as changes in the relative price of oil) and productivity shocks. Earlier in this episode, the economy benefited from a series of favorable relative price shocks and, throughout the last several years, has been adjusting to an increase in productivity growth. Both of these developments have had a temporary disinflationary effect. Together they suppressed inflation for a while, countering the potential inflationary consequences of the progressive increase in aggregate demand relative to potential supply. Once the disinflationary impetus from supply shocks begins to dissipate or to reverse, the inflationary consequences of the supply-demand balance will begin to show through. The disinflationary effect of an increase in productivity growth begins to dissipate once productivity growth stabilizes at a higher level. So unless productivity accelerates further, its disinflationary effect should continue to erode for a time. When favorable supply shocks dominate, growth in demand is stimulated and utilization rates rise, but inflation tends to moderate. The result is offsetting implications for the setting of the nominal funds rate and, thus, monetary policy may be left with little work to do. This accounts for the relative inactivity of monetary policy from 1996 through the end of 1999, at which point the federal funds rate was within 1/4 percentage point of where it was at the beginning of the period.
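Both translations just made - from a falling unemployment rate to a demand-supply growth gap, and from the NAIRU range to a level gap - involve simple arithmetic. A minimal sketch in Python; the Okun-type coefficient of roughly 2 to 2.5 is an assumption I supply for illustration, not a figure from the speech.

```python
# Stylized Okun's-law arithmetic. The coefficient (output-gap change per
# percentage point of unemployment change) is an assumed rule of thumb.

unemployment_decline_per_year = 0.4  # pp per year, as cited in the text

for okun_coefficient in (2.0, 2.5):
    gap_growth = okun_coefficient * unemployment_decline_per_year
    print(f"Okun coefficient {okun_coefficient}: demand outgrew potential "
          f"supply by about {gap_growth:.1f} pp per year")
# -> roughly 0.8 to 1.0 pp per year, consistent with the 0.75 to 1 cited.

# Level gap implied by the NAIRU central tendency versus actual unemployment:
actual_unemployment = 4.1
for nairu in (5.0, 5.25):
    print(f"NAIRU {nairu}%: labor market gap of {nairu - actual_unemployment:.2f} pp")
```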
But once the disinflationary effects of the favorable supply shocks dissipate or reverse, the challenge is more like one that accompanies demand shocks. Excess demand, evidenced by utilization rates above sustainable levels, will put upward pressure on inflation, and monetary policy must restrain aggregate demand to bring it into balance with potential supply to avoid rising inflation. A brief inflation report But is there any evidence that inflation pressures are in fact building? Of course, overall inflation has clearly increased significantly over the last year. The consumer price index, for example, has increased at a 3% rate over the last twelve months, compared with a 2.3% rate over the previous twelve months. Similar trends are evident in the PCE and in the GDP price index. But this increase in overall inflation reflects mainly the rise in oil prices over 1999 and into 2000. Assuming that oil prices stabilize, the effect will dissipate, and overall inflation will return toward, and indeed dip slightly below, the core rate (the rate net of food and energy prices). So, looking forward, the core inflation rate is the more important consideration. The core CPI advanced at a 2.2% rate over the last twelve months, a rate equal to that over the previous twelve months and only about 1/4 percentage point above the cyclical low reached in January. These numbers suggest that inflation pressures remain well contained. But digging a little deeper, the evidence, in my judgment, supports the conclusion that core inflation has moved modestly higher over the last six to nine months. First, the introduction of a methodological change in measuring the CPI in January 1999 lowered CPI inflation relative to the earlier period. As a result, on a methodologically consistent basis, core CPI inflation in the last twelve months has actually been up a couple of tenths of a percentage point. But, more important, the higher-frequency data provide some evidence of a rising trend in core inflation. For example, at an annual rate, core CPI inflation is 2.4% over the last nine months, 2.5% over the last six months and 3.2% over the last three months. So I conclude that the underlying trend for core CPI inflation has moved up to close to 2½% today. The core PCE and the GDP price indexes also have accelerated over the last six to nine months. The core PCE index increased at a 1.4% rate over 1999. Over the last year the rate was 1.6%, over the last six months 1.9%, and over the last three months 2.4%. The higher core inflation could be explained by a pass-through to the core of earlier increases in oil prices. However, whether such a pass-through leads to a one-time increase in the price level or to continuing inflation depends on whether or not monetary policy accommodates the higher inflation. Whether such accommodation occurs, in turn, depends on how policy deals with the excess demand that will be felt in the first instance in wage pressures in a very tight labor market. I therefore turn to an assessment of the pressures coming from labor compensation. Here the data are even more confusing. For example, consider the trend in year-over-year growth rates for the three key measures. For the employment cost index, the trend is decidedly up; for average hourly earnings, however, the year-over-year growth rate has been flat; and for compensation per hour in the productivity and costs report, the trend is actually down.
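A note on the annualized figures quoted above: a k-month “annual rate” compounds the price-index change over the window up to twelve months. Here is a minimal sketch in Python, with hypothetical index levels (not actual BLS data) chosen purely so the outputs land near the rates quoted.

```python
# Annualizing a price-index change over a k-month window:
# rate = ((P_now / P_then) ** (12 / k) - 1) * 100

def annualized_rate(p_now: float, p_then: float, months: int) -> float:
    """Annualized percent change of a price index over `months` months."""
    return ((p_now / p_then) ** (12 / months) - 1) * 100

# Hypothetical core CPI levels, back-solved to reproduce annualized rates
# close to those cited in the text.
index = {"now": 170.00, 3: 168.66, 6: 167.90, 9: 167.00}

for k in (3, 6, 9):
    print(f"{k}-month annualized rate: {annualized_rate(index['now'], index[k], k):.1f}%")
# -> about 3.2%, 2.5%, and 2.4%, respectively
```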
Again, we need to dig a little deeper, but this excavation will not allow us to reach a definitive judgment from this extraordinarily mixed set of indicators. Until the May employment report, the recent monthly data clearly pointed to an acceleration in average hourly earnings - given the 4½% rate posted over the first four months of the year after a 3½% rate over 1999. But the unexpectedly small increase in May left the year-over-year increase in average hourly earnings at just 3.5%, about the same as over the previous twelve months. Year-to-date, average hourly earnings have increased at a 3.8% rate - still an acceleration, but one that is far less definitive than that based on the data through April. There will be considerable interest in the next report for further evidence on the degree of upward trend in this measure. There are, in my judgment, some grounds for discounting the productivity and cost measure. During the last benchmark revision, this measure was adjusted up sharply. I will have more confidence in the recent data for this measure of labor compensation if the deceleration remains intact after the next revision. In addition, this measure tends to use trends instead of real-time data for benefit costs, although the Bureau of Economic Analysis does adjust the trends judgmentally in response to the real-time ECI data on benefit costs. Lately, the benefit component of the ECI has rebounded sharply. Even if the evidence for an acceleration in nominal labor compensation were more definitive, the implications for inflation are not altogether straightforward. If the trend in the growth of labor compensation is upward, it could be a response to the uptick in overall inflation last year or to overly tight labor markets or to a catch-up to the higher rate of productivity growth. Just as the slowing in overall inflation in 1997 and 1998 contributed to a moderation in nominal wage demands, the higher overall inflation in 1999 and 2000 would be expected to boost nominal wage demands. But any rebound in nominal labor compensation could also reflect a catch-up to higher productivity growth. If nominal compensation is just matching the higher productivity growth, this source of acceleration in nominal compensation would not itself be inflationary. But there is an important caveat here. The slow initial response of nominal compensation to higher productivity growth is the source of the temporary disinflationary effect of a productivity shock. Therefore, once the catch-up is under way, this disinflationary impetus gradually disappears. And at this point, the pass-through from higher inflation and the effect of tight labor markets have no offset and will begin to dominate. So even the catch-up story plays a role in the upward trend in inflation. A second reason that nominal compensation is so difficult to factor into an inflation forecast is that compensation practices are changing. For example, our measurement has not caught up with the increased importance of stock options. Stock options are incorporated in the productivity and cost measure - based on gains realized when options are exercised - but not in the ECI. In addition, many ways in which firms are recruiting and retaining workers - such as in-house fitness and child care centers, flexible hours, educational assistance, on-site personal services, and in-kind payments - are not reflected in compensation measures (although hiring and referral bonuses will be included in the ECI in the next release).
Finally, the growing importance of variable pay and of temporary workers may have important implications for wage dynamics that are not fully understood. So what is the outlook for inflation, and how does it relate to the interplay of new economy forces and traditional supply and demand considerations? In my judgment, we took the benefits of both the earlier favorable relative price shocks and the productivity shock, partly in a decline in the unemployment rate below the NAIRU and partly in a decline in inflation. This is not a statement about what policymakers planned, but rather about what evolved as we responded to unexpected developments in inflation and growth. At any rate, we could have taken more of the benefits of the favorable supply shocks in lower inflation, but given that inflation was already so low, the combination we ended up with seems, after the fact, to have been reasonable. At some point, however, when the temporary disinflationary impetus of the favorable supply developments dissipates, not only will there be some rebound in inflation, but unless a transition is made back to sustainable utilization rates, there will be a risk of a continuous upward movement in inflation. During that transition, at least some of the earlier decline in core inflation will be reversed. To be sure, it has been difficult to be precise about both sustainable utilization rates and the path of inflation because of uncertainties about the NAIRU and other aspects of inflation dynamics in a period of significant structural change. But I believe the qualitative story that I have set out is the right one. Given our uncertainty about sustainable utilization rates and wage-price dynamics in the new economy, however, policy setting must remain flexible and responsive to new information about both the supply and the demand sides of our economy. Conclusion: the challenge facing monetary policy This analysis suggests that monetary policy does face a challenge - rebalancing aggregate supply and demand to contain the risk of higher inflation. I believe that we have been moving effectively to get this job done. The major question in this respect is whether slowing the economy to trend alone will suffice or whether we need a period of below-trend growth to unwind an imbalance between the levels of aggregate demand and supply. If the task is only slowing the economy to trend - because the NAIRU turns out to be close to 4% - the task is not as challenging, and inflation will remain stable near current levels. If the NAIRU turns out to be closer to 5%, then the task is more demanding, and growth will have to slow to below trend for a while, and inflation is likely to rise somewhat further until the rebalancing is complete. If policy is successful, in either scenario, the payoff from monetary restraint will be both to contain the risk of higher inflation and to extend the life of this remarkable expansion. Several considerations provide some optimism that the outcome will be a benign one - a soft as opposed to a hard landing. First, we are now in a high-growth rather than a low-growth economy. Even if we have to slow growth to below trend for a period, the resulting growth rate could remain well above the average growth rate over the previous 25 years and still get the job done. Second, supply forces could smooth the transition.
If oil prices have now at least peaked - and, better yet, if they decline at least modestly over the next year and a half, as suggested by expectations reflected in futures markets - the upward impetus to overall inflation from oil prices will dissipate or even reverse. In this case, overall inflation is likely to decline next year, and this decrease could help moderate the rise in core inflation into the following year. Third, long-term inflation expectations remain firmly anchored, reflecting considerable confidence that monetary policy will contain any threat of higher inflation. This should damp the rise in inflation in the short term. Fourth, monetary policy got a head start on containing inflation by beginning to tighten last June, before the signs of building inflation pressures were evident. Fifth, the tighter monetary policy is now contributing to a less accommodative set of financial conditions throughout the economy - including higher short- and long-term private interest rates, lower equity prices, a stronger dollar, and more stringent lending conditions at banks. If these tighter financial conditions remain in place, we will have made significant progress in establishing the foundation for slower growth. References Gordon, Robert J, “Does the New Economy Measure Up to the Great Inventions of the Past”, Journal of Economic Perspectives (forthcoming). Jorgenson, Dale W, and Kevin J Stiroh, “Raising the Speed Limit: US Economic Growth in the Information Age”, 1 May 2000. Macroeconomic Advisers, “Productivity and Potential GDP in the ‘New’ US Economy”, September 1999. Oliner, Stephen D, and Daniel E Sichel, “The Resurgence of Growth in the Late 1990s: Is Information Technology the Story?”, working paper, Federal Reserve Board, February 2000.
|
board of governors of the federal reserve system
| 2000 | 6 |
Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the National Governors' Association, 92nd Annual Meeting, held at State College, Pennsylvania, on 11 July 2000.
|
Alan Greenspan: Structural change in the new economy Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the National Governors’ Association, 92nd Annual Meeting, held at State College, Pennsylvania, on 11 July 2000. * * * I am pleased to have the opportunity to meet with you today and address the remarkable changes that have been occurring in our economy. The current economic expansion has not simply set a new record for longevity. More important, the recent period has been marked by a transformation to an economy that is more productive as competitive forces become increasingly intense and new technologies raise the efficiency of our businesses. With the rapid adoption of information technology, the share of output that is conceptual rather than physical continues to grow. While these tendencies were no doubt in train in the “old”, pre-1990s economy, they accelerated over the past decade as a number of technologies with their roots in the cumulative innovations of the past half-century began to yield dramatic economic returns. As governors of our states, you have all been dealing with the practical effects of these shifts, which not only have increased prosperity but also are presenting important new challenges. The process of innovation is, of course, never ending. Indeed, the substitution of physical capital, in which new technologies are embodied, for manual labor is an ongoing trend that began nearly two centuries ago when work in craft shops shifted to factories and then to assembly lines. However, the development of the transistor after World War II appears in retrospect to have initiated a special wave of creative synergies. It brought us the microprocessor, the computer, satellites, and the joining of laser and fiber optic technologies. By the 1990s, these and a number of lesser but critical innovations had fostered an enormous new capacity to capture, analyze and disseminate information. Indeed, it is the proliferation of information technology throughout the economy that makes the current period appear so different from preceding decades. This remarkable coming together of technologies that we label IT has allowed us to move beyond efficiency gains in routine manual tasks to achieve new levels of productivity in now routine information-processing tasks that previously depended upon people to compute, sort and retrieve information for purposes of taking action. As a result, information technologies have begun to alter significantly how we do business and create economic value, often in ways that were not foreseeable even a decade ago. One result of the more rapid pace of IT innovation has been a visible acceleration of the process that noted economist Joseph Schumpeter many years ago termed “creative destruction” - the continuous shift in which emerging technologies push out the old. Today our capital stock is undergoing an increasing pace of renewal through investment of cash flow from older technology capital equipment and facilities into cutting-edge, more efficient vintages. This process of capital reallocation across the economy has been assisted by a significant unbundling of risks in capital markets made possible by the development of innovative financial products, many of which themselves owe their viability to advances in technology. At the microeconomic level, the essential contribution of information technology is the expansion of knowledge and its obverse, the reduction of uncertainty. 
Before this recent quantum jump in information availability, businesses had limited and less timely knowledge of customers’ needs and of the location of inventories and materials flowing through complex production systems. In that environment, decisions were based on information that was hours, days or even weeks old. Businesses, to protect production schedules, found it essential, although costly, to carry sizable backup stocks of materials and to keep additional persons on their payrolls for making the necessary adjustments to the inevitable miscalculations and unanticipated shifts in demand for their products and services. Of course, a great deal of imprecision persists, but the remarkable surge in the availability of real-time information has enabled businesses to reduce unnecessary inventory and dispense with labor and capital redundancies. Intermediate production and distribution processes, so essential when information and quality control were poor, are being bypassed or eliminated. There are no indications in the marketplace that the process of re-engineering business operations is slowing, although it has been difficult analytically to disentangle the part of the rise in output per hour that is permanent from the part that is the consequence of transitory business cycle forces. The part based on information advances, of course, is irreversible. Having learned to employ bar code and satellite technologies, for example, we are not about to lose our capability in applying them. But until we experience an economic slowdown, we will not know for sure how much of the extraordinary rise in output per hour in the past five years is attributable to the irreversible way value is created and how much reflects endeavors on the part of our business community to stretch existing capital and labor resources in ways that are not sustainable over the longer run. I have stressed information technology’s crucial role on the factory floor and in distribution channels. But technological innovation has spread far beyond that. Biotechnology is revolutionizing medicine and agriculture in ways that were unimaginable just a few years ago, with far-reaching consequences for the quality of life not only in the United States but also around the world. Even more intriguing are those as yet unrealized opportunities for computers and information technology to expand our scientific knowledge more generally. As I indicated earlier, the major contribution of advances in information technology and their incorporation into the capital stock has been to reduce the number of worker hours required to produce the nation’s output, our proxy for productivity growth. Echoing a debate that is as old as Adam Smith, some view this so-called labor-displacing investment and the introduction of innovative production processes as a threat to our economy’s capacity to create new jobs. But because technological change spawns so many opportunities for businesses to expand, the introduction of new efficiencies has today, as in the past, created a vibrant economy in which opportunities for new jobs and businesses have blossomed. An intriguing aspect of the recent wave of productivity acceleration is that US businesses and workers appear to have benefited more from the recent advances in information technology than their counterparts in Europe or Japan. Those countries, of course, have also participated in this wave of invention and innovation, but they appear to have been slower to exploit it.
The relatively inflexible and, hence, more costly labor markets of these economies appear to be a significant part of the explanation. The elevated rates of return offered by the newer technologies in the United States are largely the result of a reduction in labor costs per unit of output. The rates of return on investment in the same new technologies are correspondingly less in Europe and Japan because businesses there face higher costs of displacing workers than we do. Here, labor displacement is more readily countenanced both by law and by culture. Parenthetically, because our costs of dismissing workers are lower, the potential costs of hiring and the risks associated with expanding employment are less. The result of this significantly higher capacity for job dismissal has been, counterintuitively, a dramatic decline in the US unemployment rate in recent years. But one less welcome by-product of rapid economic and technological change, and the necessarily heightened level of potential job dismissal that goes with it, is the evident insecurity felt by many workers despite the tightest labor markets in decades. This anxiety stems, I suspect, from a fear of job skill obsolescence, and one very tangible measure of it is the pressure on our education and training systems to prepare and adapt workers to effectively run the new technologies. These pressures are likely to remain intense, even though they may wax and wane, because I see nothing to suggest that the trends toward a greater conceptual content of our nation’s output and, thus, toward increased demand for conceptual skills in our workforce will end. The rapidity of innovation and the unpredictability of the directions it may take imply a need for considerable investment in human capital. Even the most significant advances in information and computer technology will produce little additional economic value without human creativity and intellect. The heyday when a high school or college education would serve a graduate for a lifetime is gone; basic credentials, by themselves, are not enough to ensure success in the workplace. Today’s recipients of diplomas expect to have many jobs and to use a wide range of skills over their working lives. Their parents and grandparents looked to a more stable future - even if in reality it often turned out otherwise. Workers must be equipped not simply with technical know-how but also with the ability to create, analyze and transform information and to interact effectively with others. Moreover, learning will increasingly be a lifelong activity. Certainly, the notion that human and physical capital are complements is not new. Technological advance has inevitably brought with it improvements not only in the capital inputs used in production but also new demands on workers who must interact with that increasingly more complex stock of capital. Early in this century, these advances required workers with a higher level of cognitive skills, for instance the ability to read manuals, to interpret blueprints or to understand formulae. Our educational system responded: in the 1920s and 1930s, high school enrollment in this country expanded rapidly, pulling youth from rural areas, where opportunities were limited, into more productive occupations in business and broadening the skills of students to meet the needs of an advancing manufacturing sector. It became the job of these institutions to prepare students for work life, not just for a transition to college.
In the context of the demands of the economy at that time, a high school diploma represented the training needed to be successful in most aspects of American enterprise. The economic returns for having a high school diploma rose and, as a result, high school enrollment rates climbed. At the same time, our system of higher education was also responding to the advances in economic processes. Although many states had established land-grant schools earlier, their support accelerated in the late 19th century as those states whose economies specialized in agriculture and mining sought to take advantage of new scientific methods of production. Early in the 20th century, the content of education at an American college had evolved from a classically based curriculum to one combining the sciences, empirical studies and modern liberal arts. Universities responded to the need for the application of science - particularly chemistry and physics - to the manufacture of steel, rubber, chemicals, drugs, petroleum and other goods requiring the newer production technologies. Communities looked to their institutions of higher learning for leadership in scientific knowledge and for training of professionals such as teachers and engineers. The scale and scope of higher education in America were being shaped by the recognition that research - the creation of knowledge - complemented teaching and training - the diffusion of knowledge. In a global environment in which prospects for economic growth now depend importantly on a country’s capacity to develop and apply new technologies, our universities are envied around the world. The payoffs - in terms of the flow of expertise, new products and start-up companies, for example - have been impressive. Here, perhaps the most frequently cited measure of our success has been the emergence of significant centers of commercial innovation and entrepreneurship where creative ideas flow freely between local academic scholars and those in industry. Not all that long ago, it was easy to recite a relatively short list of places where these activities were clustered. But we have witnessed in recent years a great multiplicity of these centers of innovation. State support, both for the university system and for small businesses, has been an important element in the vitality of these centers. Certainly, if we are to remain preeminent in transforming knowledge into economic value, the US system of higher education must remain the world’s leader in generating scientific and technological breakthroughs and in preparing workers to meet the evolving demands for skilled labor. With two-thirds of our high school graduates now enrolling in college and an increasing proportion of adult workers seeking opportunities for retooling, our institutions of higher learning increasingly bear an important responsibility for ensuring that our society is prepared for the demands of rapid economic change. Equally critical to our investment in human capital is the quality of education in our elementary and secondary schools. As you know, the results of international comparisons of student achievement in mathematics and science, which indicated that the performance of US twelfth-grade students fell short of that of their peers in other countries, heightened the debate about the quality of education below the college level.
To be sure, substantial reforms in math and science education have been under way for some time, and I am encouraged that policymakers, educators and the business community recognize the significant contribution that a stronger elementary and secondary education system will make in boosting the potential productivity of new generations of workers. I hope that we will see that the efforts to date have paid off in raising the achievement of US students when the results of the 1998-99 international comparisons for eighth graders are published. Whatever the outcome, the pressures to advance our education system will continue to be intense. As the conceptual share of the value added in our economic processes expands further, the ability to think abstractly will be increasingly important across a broad range of professions. Critical awareness and the abilities to hypothesize, to interpret and to communicate are essential elements of successful innovation in a conceptual-based economy. As with many skills, such learning is most effective when it is begun at an early age. And most educators believe that exposure to a wide range of subjects - including literature, music, art and languages - plays a considerable role in fostering the development of these skills. As you know, school districts are also being challenged to evaluate how new information technologies can be best employed in their curricula. Unfortunately, this goal has too often been narrowly interpreted as teaching students how to type on the computer or permitting students to research projects over the internet. Incorporating new technologies into the educational process is indeed likely to be an important element in improving our schools, but it must involve more than simply wiring the classroom. Human capital - in the form of our teachers - and technology are complements in producing education output just as they are in other business activities. To achieve the most effective outcome from new technologies, we must provide teachers with the necessary training to use them effectively and provide forums for teachers and education researchers to share ideas and approaches on how best to integrate technology into the curriculum. And we must create partnerships among the states, the school systems, labor and industry to develop appropriate standards and guidelines for the teaching of information technology in the classroom. A crucial concern today - and I know that the National Governors’ Association is working hard to address this issue - is that the supply of qualified teachers will be insufficient to meet the demand. Indeed, a substantial number of teachers are scheduled to retire over the next decade, and how to replace them and meet the additional demand from rising enrollments is certain to be a significant challenge in the years ahead.
We need to foster a flexible education system - one that integrates work and training and that serves the needs both of experienced workers at different stages in their careers and of students embarking on their initial course of study. Community colleges, for example, have become important providers of job skills training not just for students who may eventually move on to a four-year college or university but for individuals with jobs - particularly older workers seeking to retool or retrain. The increasing availability of courses that can be “taken at a distance” over the internet means that learning can more easily occur outside the workplace or the classroom - an innovation that may be particularly valuable for states with large rural populations for whom access to traditional classroom learning is more difficult. In summary, we are in a period of rapid innovation that is bringing with it enormous opportunities to enhance living standards for a large majority of Americans. Our ability to take advantage of these opportunities is not only influenced by national policies but is also determined importantly at the state level. States with more flexible labor markets, skilled work forces and a reputation for supporting innovation and entrepreneurship will be prime locations for firms at the cutting edge of technology. Not all new enterprises will succeed, of course. But many will, and those that do will provide the impetus for further economic progress and expanding opportunities in their communities. Your leadership as policymakers will be a key element in promoting an environment in which you join with others in business, labor and education to realize the potential that technological change has for bringing substantial and lasting benefits to our economy.
|
board of governors of the federal reserve system
| 2000 | 7 |
Speech by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, at the Financial Crisis Conference, Council on Foreign Relations, held in New York, on 12 July 2000.
|
Alan Greenspan: Global challenges Speech by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, at the Financial Crisis Conference, Council on Foreign Relations, held in New York, on 12 July 2000. * * * I would like to thank you for the opportunity to address this distinguished audience this evening. The issues you have raised these two days are as interesting and timely as they are challenging. The evidence in recent years has become increasingly persuasive that the greater is the degree of openness in cross-border trade in goods and services, the faster will be the rate of growth of an economy and eventually the higher its standard of living. What we do not know for sure, but strongly suspect, is that one major reason for the continued rapid growth in world trade is the accelerating expansion of global finance. This acceleration itself appears to require ever-newer forms and layers of financial intermediation. Certainly, the emergence of a highly liquid foreign exchange market has facilitated basic forex transactions, and the availability of a wider array of financial instruments has allowed the development of more complex hedging strategies that have enabled producers and investors to better allocate risk. This owes much to the ability of derivatives and other modern financial products to unbundle complex risks in ways that enable each counterparty to choose the combination of risks necessary to advance its business strategy and to eschew those combinations that do not. More efficient allocation of risk facilitates portfolio investment strategies, enhances the lower-cost financing of real capital formation on a worldwide basis, and hence leads to an expansion of cross-border trade in goods and services and rising standards of living. Notwithstanding the demonstrable advantages of what can aptly be described as a new international financial system, the Mexican financial breakdown in late 1994 and the recent episodes in East Asia and elsewhere have raised questions about the inherent stability of this new system. More disturbing has been our inability to anticipate these crises. Of course, something in the nature of such events may preclude their being foreseen, in that the market forces that produce a crisis would likely fend it off if it were anticipated. However, even if we are not able to readily forecast an erosion in the level of confidence, we may at least have the capacity to put preventive, financial shock-absorbing measures in place, thereby lowering the probability of the onset of crisis. With this motivation, those of us active in evolving global markets - whether as private sector participants or as policy officials - have been endeavoring to understand the sources of instability that have flared up at times in recent years. It certainly appears to be the case that many open emerging market economies that have become active participants in the global economy over the past decade or so were exposed to a huge expansion in capital inflows that their economic and financial systems were not yet ready to absorb. These flows had been engendered by the increasing diversification out of industrial country investment portfolios, induced in part by significant capital gains prior to the onset of the crisis in East Asia. Net private capital inflows into emerging markets roughly quadrupled between 1990 and 1997.
Such diversification was directed particularly at those economies in Asia that had been growing so vigorously through the 1970s, 1980s, and into the 1990s - the so-called “Asian tigers”. In the event, these economies were ill-prepared to absorb such volumes of funds, especially, as I shall shortly argue, because there were inadequate alternatives to banks to back up financial intermediation. More generally, there were simply not enough productive investment opportunities to yield the returns that investors in industrial countries were seeking. It was perhaps inevitable then that the excess cash found its way in too many instances into ill-conceived and unwisely financed ventures, including many in real estate.

What appeared to be a successful locking of currencies onto the dollar over a period of years in East Asia led, perhaps not unexpectedly, to large borrowings of relatively cheap dollars that, in turn, were lent unhedged at elevated domestic interest rates - rates whose devaluation risk premiums went largely unheeded. When the extent of such unhedged dollar borrowings finally came to be recognized as excessive, as was almost inevitable, the exchange rate broke. Although it might seem that the full consequences were predictable, they were not. Problems with imprudently financed real estate investments emerge with chronic frequency around the globe without triggering the size of the collapse experienced in East Asia in 1997. In the case of the East Asian economies, the magnitude of the crisis became evident only when the normal economic buffers that are available to absorb shocks were so readily breached under pressure.

It has taken the long-standing participants in the international financial community many decades to build sophisticated financial and legal infrastructures that buffer shocks and limit systemic fallout from market disturbances. Those infrastructures discourage speculative attacks against a well-entrenched currency because financial systems are robust and are able to withstand vigorous policy responses to such attacks. However, the institutions of the newer participants in global finance had not been tested, until recently, against the rigors of major league pitching, to use a baseball analogy.

These recent crises have underscored certain financial structure vulnerabilities that are not readily assuaged in the short run but, nonetheless, will be increasingly important to address in any endeavor to build formidable buffers against financial stress. Among the most important, in my judgment, is the development of alternatives that enable financial systems under stress to maintain an adequate degree of financial intermediation even should their main source of intermediation, whether banks or capital markets, freeze up in a crisis.

The existence of multiple avenues of financial intermediation has served the United States well in recent decades, especially during the credit crunch of the late 1980s and more recently when our capital markets froze up in 1998 following the Russian default. As I indicated in a World Bank/IMF seminar last year, for a time following the Russian default not even investment-grade bond issuers could find reasonable takers. Although Federal Reserve easing in the midst of this turmoil doubtless contributed to improved conditions, it is not credible that our actions provide the whole explanation for the dramatic restoration of most, though not all, markets in a matter of weeks.
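To make the mechanism concrete, here is a minimal sketch of the unhedged carry arithmetic described above; every figure (loan size, rates, exchange rates) is an invented assumption for illustration, not a number from the speech.

```python
# Hypothetical illustration of the unhedged dollar-funding mismatch
# described above; every number here is an assumption for exposition.

def carry_pnl(usd_borrowed, usd_rate, baht_rate, fx_start, fx_end):
    """Dollar P&L of borrowing dollars, lending in baht unhedged for
    one year, then converting back at the year-end exchange rate."""
    baht_lent = usd_borrowed * fx_start          # convert to baht at the start
    baht_repaid = baht_lent * (1 + baht_rate)    # baht loan repays at the local rate
    usd_recovered = baht_repaid / fx_end         # convert back at the new rate
    usd_owed = usd_borrowed * (1 + usd_rate)     # dollar funding must be repaid
    return usd_recovered - usd_owed

# Peg holds: the carry looks like easy profit.
print(carry_pnl(100e6, 0.06, 0.13, 25.0, 25.0))   # ~ +7.0 million

# Peg breaks (baht falls from 25 to 40 per dollar): the profit becomes a large loss.
print(carry_pnl(100e6, 0.06, 0.13, 25.0, 40.0))   # ~ -35.4 million
```

The sketch shows why the carry appeared riskless while the peg held and why recognition of the aggregate mismatch, once it came, was so abrupt.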
The seizure appeared too deep-seated to be readily unwound solely by a cumulative 75-basis-point ease in overnight rates. Arguably, at least as relevant was the existence of financial institutions, especially commercial banks, that replaced the intermediation function of temporarily disrupted public capital markets. In the United States, as corporate debt issuance fell under these circumstances, commercial bank lending picked up, helping to fill in much of the funding gap. Even though bankers also moved significantly to limit risk exposure, previously committed lines of credit, in conjunction with Federal Reserve ease, provided an important partial backstop to business financing.

Conversely, when American banks seized up in 1990 as a consequence of a collapse in the value of real estate collateral, the capital markets, largely unaffected by the decline in values, were able to substitute for the loss of bank financial intermediation. Without the availability of financing through the capital market, the mild recession of 1991 likely would have been far more severe.

Multiple sources of intermediation are not strictly an American phenomenon, of course. Sweden, for example, also has a corporate sector with a variety of non-banking funding sources. The speed with which its financial system overcame an early-1990s real estate crisis offers a stark contrast with the long-lasting problems of Japan, whose financial system is the archetype of financial intermediation that relies almost exclusively on banks.

This leads one to wonder whether East Asia’s recent problems would have been less severe had those economies not relied so heavily on banks as their principal means of financial intermediation. One can readily understand that the buildup of unhedged short-term dollar liabilities to be invested in Thai baht domestic loans (counting on the dollar exchange rate to hold) would at some point trigger a halt in lending by Thailand’s banks. But did the economy need to collapse with it? Had a functioning capital market existed, the outcome might well have been far more benign. Before the crisis broke there was little reason to question the three decades of phenomenally solid East Asian economic growth, largely financed through the banking system, so long as rapidly expanding bank credit outpaced lagging losses and hence depressed the ratio of non-performing loans to total bank assets.

The failure to have alternative forms of intermediation was of little consequence so long as the primary means worked. That is, the lack of a spare tire is of no concern if you do not get a flat. East Asia had no spare tires. The United States did in 1990 and again in 1998.

Banks, being highly leveraged institutions, have, throughout their history, periodically fallen into crisis. When these institutions were the sole source of finance, their difficulties often pulled their economies down as well. One can wonder whether the numerous banking crises of 19th-century America, when banks were also virtually the sole intermediary, would have disabled our economy as they periodically did, had alternative means of intermediation been available.

Regrettably, even with our best efforts, multiple intermediation buffers and other long-term initiatives cannot be created and implemented overnight. Efforts in that direction fortunately are being furthered among a number of emerging economies.
In the interim, it is essential that we employ the current period of relative international financial stability to address as best we can some of the more evident short-run potentials for crises. To repeat, we do not, and probably cannot, know the precise nature of the next international financial crisis. That there will be one is as certain as the persistence of human financial indiscretion. We can be reasonably sure that it will not be exactly the same as past crises. Crises never are the same because market participants do not, as readily as supposed, repeat their mistakes of the past. Therefore, we need flexible institutions that can adapt to the unforeseeable needs of the next crisis, not financial Maginot Lines that endeavor to fend off revisiting previous crises that will not be replicated.

While we may not be fully knowledgeable about all of the ramifications of our new international financial structure, some characteristics have become increasingly apparent. First, there are no signs as yet that the globalization process is about to stop or, excluding the one-off effect of the introduction of the euro, even slow down perceptibly in the immediate future. The consequent risks are unavoidable. Thus, we need to proceed expeditiously with the tasks of designing and implementing those improvements in both the short-term and the long-term buffers that have a reasonable prospect of protecting the international financial architecture in the years ahead.

Extensive efforts of recent years to bolster our international financial structure through enhanced regulatory supervision have too often proved ineffective. Fortunately, there are good reasons to believe that, properly structured, the markets themselves can provide the self-correcting discipline that is so necessary to financial stability. However, for markets to perform this job, participants need to have information about counterparties and market leverage, for example, and this information must be relevant, timely and accurate. A high level of transparency in the way domestic finance operates and is supervised is essential if investors are to make more knowledgeable commitments and supervisors are to judge the soundness of such commitments by the financial institutions that they supervise.

I find it difficult to believe, for example, that the crises in Thailand and Korea would have been nearly so virulent had their central banks published data on net reserves prior to the crises, instead of only their far less informative gross reserve positions. Some private capital inflows would almost surely have been withheld, and policymakers would have been forced to make hard choices more promptly if evidence of difficulty had emerged earlier.

Better information and greater transparency, however, are not enough. The right incentive structure has to be in place as well. Private capital markets are the fundamental building block of the capitalist system of resource allocation across activities and over time. Such markets can function properly only if investors bear the costs of their bad decisions and bad luck and reap the benefits of their good decisions and good luck. That is, if risky investments in emerging market economies, for example, turn out poorly - as risky investments are wont to do on occasion - governments or international financial institutions should not endeavor to shield investors from loss. This is as it should be, since investors earn premiums to compensate for the risks of such investments.
Efforts to bail out investors, no matter how well intentioned, run the danger of encouraging excessive risk-taking down the road by, in effect, overcompensating risk bearing.

Despite the increased sophistication and complexity of financial instruments, it is not possible to take account in today’s market transactions of all possible future outcomes. Markets operate under uncertainty. It is therefore crucial to market performance that participants manage their risks properly, and this is true at the national and official levels of both industrial and developing countries as well as at the level of individual private-sector firms. To the extent that policymakers in industrial or emerging economies are unable to anticipate or evaluate the types of complex risks that the newer financial technologies are producing, the answer, as it always has been, is less leverage - that is, less debt, more equity and hence a larger buffer against adversity and contagion.

The type of investment instruments issued also can help contain financial stresses. It is no doubt more effective to have mechanisms that allow losses to show through regularly and predictably than to have them allocated by some official entity in the wake of default. In the former case, the private sector will be better able to price risk, making the judgment about new credit decisions more informed. This is particularly so in the international arena, where national sovereignty severely limits the reach of national bankruptcy laws as well as the scope for any realistic hopes for international bankruptcy procedures.

It is worth noting that many existing investment instruments already have such desirable properties. Capital losses on equities, for example, spread the costs of the Asian crisis across issuers and investors. From June 1997 through the trough of the crises in August 1998, emerging market equity losses worldwide are estimated to have been more than $1 trillion. At the same time, the scope for downward revisions to dividends during a crisis allows issuing firms to cushion some of their cash flow pressures. Moreover, over the same June 1997 to August 1998 period, losses to foreign investors in emerging market fixed income instruments amounted to $80 billion. I might note that with a flexible exchange rate regime the value of investments in local currency debt will fluctuate with the value of major foreign currencies such as the dollar, allowing for additional regularity in price adjustment.

Private market processes have served this country and the world economy well to date, and we should rely on them as much as possible as we go forward. This is not to say that the official sector will have no role to play in the next global crisis, or the one after that. Official safety nets and interventions cannot be eliminated entirely. There are limits to the size and extent of the shocks that the private sector can manage, at least in the short run, without official assistance. However, official financial support should be kept to a minimum for reasons of fiscal prudence, resource allocation efficiency and avoidance of moral hazard over the long run.

There is, of course, in any economic system the necessity for sound monetary and fiscal policies, the absence of which was so often the cause of earlier international financial crises. With increased emphasis on private international capital flows, especially interbank flows, private misjudgments within flawed economic structures have been the major contributors to recent problems.
But inappropriate macroeconomic policies also have been a factor for some emerging market economies. We may be in a rapidly evolving international financial system with all the bells and whistles of the so-called new economy, but the old-economy rules of prudence are as formidable as ever. We violate them at our own peril.
Alan Greenspan: Federal Reserve’s report on monetary policy

Testimony of Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, before the Committee on Banking, Housing, and Urban Affairs of the US Senate on 20 July 2000.

* * *

Mr Chairman and other members of the Committee, I appreciate this opportunity to present the Federal Reserve’s report on monetary policy.

The Federal Reserve has been confronting a complex set of challenges in judging the stance of policy that will best contribute to sustaining the strong and long-running expansion of our economy. The challenges will be no less in coming months as we judge whether ongoing adjustments in supply and demand will be sufficient to prevent distortions that would undermine the economy’s extraordinary performance.

For some time now, the growth of aggregate demand has exceeded the expansion of production potential. Technological innovations have boosted the growth rate of potential, but as I noted in my testimony last February, the effects of this process also have spurred aggregate demand. It has been clear to us that, with labor markets already quite tight, a continuing disparity between the growth of demand and potential supply would produce disruptive imbalances.

A key element in this disparity has been the very rapid growth of consumption resulting from the effects on spending of the remarkable rise in household wealth. However, the growth in household spending has slowed noticeably this spring from the unusually rapid pace observed late in 1999 and early this year. Some argue that this slowing is a pause following the surge in demand through the warmer-than-normal winter months and hence a reacceleration can be expected later this year. Certainly, we have seen slowdowns in spending during this near-decade-long expansion that have proven temporary, with aggregate demand growth subsequently rebounding to an unsustainable pace.

But other analysts point to a number of factors that may be exerting more persistent restraint on spending. One they cite is the flattening in equity prices, on net, this year. They attribute much of the slowing of consumer spending to this diminution of the wealth effect through the spring and early summer. This view looks to equity markets as a key influence on the trend in consumer spending over the rest of this year and next.

Another factor said by some to account for the spending slowdown is the rising debt burden of households. Interest and amortization as a percent of disposable income have risen materially during the past six years, as consumer and especially mortgage debt has climbed and, more recently, as interest rates have moved higher. In addition, the past year’s rise in the price of oil has amounted to an annual $75 billion levy by foreign producers on domestic consumers of imported oil, the equivalent of a tax of roughly 1 percent of disposable income. This burden is another likely source of the slowed growth in real consumption outlays in recent months, though one that may prove to be largely transitory.

Mentioned less prominently have been the effects of the faster increase in the stock of consumer durable assets - both household durable goods and houses - in the last several years, a rate of increase that history tells us is usually followed by a pause.
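As a rough cross-check on the oil-levy arithmetic above, the sketch below backs the figure out from first principles; the import volume, price rise and disposable-income level are illustrative assumptions of mine, not numbers from the testimony.

```python
# Hypothetical back-of-the-envelope check on the "$75 billion levy" claim.
# All three inputs below are assumptions for illustration only.
NET_OIL_IMPORTS_BPD = 10.5e6   # assumed net petroleum imports, barrels per day
PRICE_RISE_PER_BBL = 19.5      # assumed rise in the price per barrel over the year
DISPOSABLE_INCOME = 7.0e12     # assumed disposable personal income, dollars

annual_levy = NET_OIL_IMPORTS_BPD * 365 * PRICE_RISE_PER_BBL
print(f"implied levy: ${annual_levy / 1e9:.0f} billion")                     # ~ $75 billion
print(f"share of disposable income: {annual_levy / DISPOSABLE_INCOME:.1%}")  # ~ 1.1%
```

Under these assumed inputs the levy and its roughly one-percent share of disposable income both come out close to the figures quoted in the testimony.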
Stocks of household durable goods, including motor vehicles, are estimated to have increased at nearly a 6 percent annual rate over the past three years, a marked acceleration from the growth rate of the previous ten years. The number of cars and light trucks owned or leased by households, for example, apparently has continued to rise in recent years despite having reached nearly 1¾ vehicles per household by the mid-1990s. Notwithstanding their recent slowing, sales of new homes continue at extraordinarily high levels relative to new household formations. While we will not know for sure until the 2000 census is tabulated, the surge in new home sales is strong evidence that the growth of owner-occupied homes has accelerated during the past five years.

Those who focus on the high and rising stocks of durable assets point out that even without the rise in interest rates, an eventual leveling out or some tapering off of purchases of durable goods and construction of single-family housing would be expected. Reflecting both higher interest rates and higher stocks of housing, starts of new housing units have fallen off of late. If that slowing were to persist, some reduction in the rapid pace of accumulation of household appliances across our more than one hundred million households would not come as a surprise, nor would a slowdown in vehicle demand so often historically associated with declines in housing demand. Inventories of durable assets in households are just as formidable a factor in new production as inventories at manufacturing and trade establishments.

The notion that consumer spending and housing construction may be slowing because the stock of consumer durables and houses may be running into upside resistance is a credible addition to the possible explanations of current consumer trends. This effect on spending would be reinforced by the waning effects of gains in wealth. Because the softness in outlay growth is so recent, all of the aforementioned hypotheses, of course, must be provisional. It is certainly premature to make a definitive assessment of either the recent trends in household spending or what they mean. But it is clear that, for the time being at least, the increase in spending on consumer goods and houses has come down several notches, albeit from very high levels.

In one sense, the more important question for the longer-term economic outlook is the extent of any productivity slowdown that might accompany a more subdued pace of production and consumer spending, should it persist. The behavior of productivity under such circumstances will be a revealing test of just how much of the rapid growth of productivity in recent years has represented structural change as distinct from cyclical aberrations and, hence, how truly different the developments of the past five years have been. At issue is how much of the current downshift in our overall economic growth rate can be accounted for by reduced growth in output per hour and how much by slowed increases in hours.

So far there is little evidence to undermine the notion that most of the productivity increase of recent years has been structural and that structural productivity may still be accelerating. New orders for capital equipment continue quite strong - so strong that the rise in unfilled orders has actually steepened in recent months. Capital-deepening investment in a broad range of equipment embodying the newer productivity-enhancing technologies remains brisk.
To be sure, if current personal consumption outlays slow significantly further than the pattern now in train suggests, profit and sales expectations might be scaled back, possibly inducing some hesitancy in moving forward even with capital projects that appear quite profitable over the longer run. In addition, the direct negative effects of the sharp recent runup in energy prices on profits as well as on sales expectations may temporarily damp capital spending. Despite the marked decline over the past decades in the energy requirements per dollar of GDP, energy inputs are still a significant element in the cost structure of many American businesses.

For the moment, the drop-off in overall economic growth to date appears about matched by reduced growth in hours, suggesting continued strength in growth in output per hour. The increase of production worker hours from March through June, for example, was at an annual rate of ½ percent, compared with 3¼ percent over the previous three months. Of course, we do not have comprehensive measures of output on a monthly basis, but available data suggest a roughly comparable deceleration. A lower overall rate of economic growth that did not carry with it a significant deterioration in productivity growth obviously would be a desirable outcome. It could conceivably slow or even bring to a halt the deterioration in the balance of overall demand and potential supply in our economy.

As I testified before this committee in February, domestic demand growth, influenced importantly by the wealth effect on consumer spending, has been running 1½ to 2 percentage points at an annual rate in excess of even the higher, productivity-driven, growth in potential supply since late 1997. That gap has been filled both by a marked rise in imports as a percent of GDP and by a marked increase in domestic production resulting both from significant immigration and from the employment of previously unutilized labor resources. I also pointed out in February that there are limits to how far net imports - or the broader measure, our current account deficit - can rise, or our pool of unemployed labor resources can fall. As a consequence, the excess of the growth of domestic demand over potential supply must be closed before the resulting strains and imbalances undermine the economic expansion that now has reached 112 months, a record for peace or war.

The current account deficit is a proxy for the increase in net claims against US residents held by foreigners, mainly as debt, but increasingly as equities. So long as foreigners continue to seek to hold ever-increasing quantities of dollar investments in their portfolios, as they obviously have been, the exchange rate for the dollar will remain firm. Indeed, the same sharp rise in potential rates of return on new American investments that has been driving capital accumulation and accelerating productivity in the United States has also been inducing foreigners to expand their portfolios of American securities and direct investment. The latest data published by the Department of Commerce indicate that the annual pace of direct plus portfolio investment by foreigners in the US economy during the first quarter was more than two and one-half times its rate in 1995. There has to be a limit as to how much of the world’s savings our residents can borrow at close to prevailing interest and exchange rates.
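The annual rates quoted above follow from compounding a three-month change to a yearly pace. A minimal sketch of that standard conversion, using invented index levels:

```python
# Hypothetical illustration of converting a three-month change in an
# index (e.g., production worker hours) to a compound annual rate.
def annualized_rate(level_start, level_end, months=3):
    """Compound annual growth rate implied by a change over `months` months."""
    return (level_end / level_start) ** (12 / months) - 1

# Invented index levels: a 0.125% rise over three months ...
print(f"{annualized_rate(100.000, 100.125):.2%}")   # ~0.50% annual rate
# ... versus a 0.80% rise over the previous three months.
print(f"{annualized_rate(100.000, 100.800):.2%}")   # ~3.24% annual rate
```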
And a narrowing of disparities among global growth rates could induce a narrowing of rates of return here relative to those abroad that could adversely affect the propensity of foreigners to invest in the United States. But obviously, so long as our rates of return appear to be unusually high, if not rising, balance of payments trends are less likely to pose a threat to our prosperity.

In addition, our burgeoning budget surpluses have clearly contributed to a fending off, if only temporarily, of some of the pressures on our balance of payments. The stresses on the global savings pool resulting from the excess of domestic private investment demands over domestic private saving have been mitigated by the large federal budget surpluses that have developed of late. Moreover, by substantially augmenting national saving, these budget surpluses have kept real interest rates at levels lower than they would have been otherwise. This development has helped foster the investment boom that in recent years has contributed greatly to the strengthening of US productivity and economic growth. The Congress and the Administration have wisely avoided steps that would materially reduce these budget surpluses. Continued fiscal discipline will contribute to maintaining robust expansion of the American economy in the future.

Just as there is a limit to our reliance on foreign saving, so is there a limit to the continuing drain on our unused labor resources. Despite the ever-tightening labor market, as yet, gains in compensation per hour are not significantly outstripping gains in productivity. But as I have argued previously, should labor markets continue to tighten, short of a repeal of the law of supply and demand, labor costs eventually would have to accelerate to levels threatening price stability and our continuing economic expansion.

The more modest pace of increase in domestic final spending in recent months suggests that aggregate demand may be moving closer into line with the rate of advance in the economy’s potential, given our continued impressive productivity growth. Should these trends toward supply and demand balance persist, the ongoing need for ever-rising imports and for a further draining of our limited labor resources should ease or perhaps even end. Should this favorable outcome prevail, the immediate threat to our prosperity from growing imbalances in our economy would abate.

But as I indicated earlier, it is much too soon to conclude that these concerns are behind us. We cannot yet be sure that the slower expansion of domestic final demand, at a pace more in line with potential supply, will persist. Even if the growth rates of demand and potential supply move into better balance, there is still uncertainty about whether the current level of labor resource utilization can be maintained without generating increased cost and price pressures. As I have already noted, to date costs have been held in check by productivity gains. But at the same time, inflation has picked up - even the core measures that do not include energy prices directly. Higher rates of core inflation may mostly reflect the indirect effects of energy prices, but the Federal Reserve will need to be alert to the risks that high levels of resource utilization may put upward pressure on inflation.

Moreover, energy prices may pose a challenge to containing inflation. Energy price changes represent a one-time shift in a set of important prices, but by themselves generally cannot drive an ongoing inflation process.
The key to whether such a process could get under way is inflation expectations. To date, survey evidence, as well as readings from the Treasury’s inflation-indexed securities, suggests that households and investors do not view the current energy price surge as affecting longer-term inflation. But any deterioration in such expectations would pose a risk to the economic outlook.

As the financing requirements for our ever-rising capital investment needs mounted in recent years - beyond forthcoming domestic saving - real long-term interest rates rose to address this gap. We at the Federal Reserve, responding to the same economic forces, have moved the overnight federal funds rate up 1¾ percentage points over the past year. To have held to the federal funds rate of June 1999 would have required a massive increase in liquidity that would presumably have underwritten an acceleration of prices and, hence, an eventual curbing of economic growth.

By our meeting this June, the appraisal of all the foregoing issues led the Federal Open Market Committee to conclude that, while some signs of slower growth were evident and justified standing pat at least for the time being, they were not sufficiently compelling to alter our view that the risks remained more on the side of higher inflation.

As indicated in their forecasts, FOMC members and nonvoting presidents expect that the long period of continuous economic expansion will be extended over the next year and one-half, but with growth at a somewhat slower pace than over the past several years. For the current year, the central tendency of Board members’ and Reserve Bank presidents’ forecasts is for real GDP to increase 4 to 4½ percent, suggesting a noticeable deceleration over the second half of 2000 from its likely pace over the first half. The unemployment rate is projected to remain close to 4 percent. This outlook is a little stronger than anticipated last February, no doubt owing primarily to the unexpectedly strong jump in output in the first quarter. Mainly reflecting higher prices of energy products than had been foreseen, the central tendency for inflation this year in prices for personal consumption expenditures also has been revised up somewhat, to the vicinity of 2½ to 2¾ percent.

Given the firmer financial conditions that have developed over the past eighteen months, the Committee expects economic growth to moderate somewhat next year. Real output is anticipated to expand 3¼ to 3¾ percent, somewhat less rapidly than in recent years. The unemployment rate is likely to remain close to its recent very low levels. Energy prices could ease somewhat, helping to trim PCE inflation next year to around 2 to 2½ percent, somewhat above the average of recent years.

Conclusion

The last decade has been a remarkable period of expansion for our economy. Federal Reserve policy through this period has been required to react to a constantly evolving set of economic forces, often at variance with historical relationships, changing federal funds rates when events appeared to threaten our prosperity, and refraining from action when that appeared warranted. Early in the expansion, for example, we kept rates unusually low for an extended period, when financial sector fragility held back the economy. Most recently we have needed to raise rates to relatively high levels in real terms in response to the side effects of accelerating growth and related demand-supply imbalances.
Variations in the stance of policy - or keeping it the same - in response to evolving forces are made in the framework of an unchanging objective - to foster as best we can those financial conditions most likely to promote sustained economic expansion at the highest rate possible. Maximum sustainable growth, as history so amply demonstrates, requires price stability. Irrespective of the complexities of economic change, our primary goal is to find those policies that best contribute to a noninflationary environment and hence to growth. The Federal Reserve, I trust, will always remain vigilant in pursuit of that goal.
Alan Greenspan: Global economic integration - opportunities and challenges

Speech by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, at a symposium sponsored by the Federal Reserve Bank of Kansas City, Jackson Hole, on 25 August 2000.

* * *

Globalization as most economists understand it involves the increasing interaction of national economic systems. Of necessity, these systems are reasonably compatible and, in at least some important respects, market oriented. Certainly, market-directed capitalism has become the paradigm for most of the world, as central-planning regimes have fallen into disfavor since their undisputed failures around the world in the four decades following World War II. But there remains an active intellectual debate over the elements of capitalism that are perceived as most essential for a productive and civil society.

The practical manifestation of that debate can be seen in the stresses on our various political and legal systems. Opposing forces sometimes reflect significantly different underlying views of how societal values should be traded off, but more deeply, they often demonstrate different understandings of the way economies work.

Earlier in the postwar period, even we in the West believed that market failure was a common occurrence. To some, this belief justified significant state controls and frequent intervention on the microeconomic level to improve, as they saw it, the functioning of markets and to maintain economic stability and growth. At the macroeconomic level, an exploitable tradeoff between unemployment and inflation was widely believed to exist, and a little inflation was perceived as useful to prime the pump of prosperity. Remnants of those views, of course, remain.

But it is remarkable how far economic opinions and “conventional wisdom” have shifted since the 1970s. At the risk of some oversimplification, there has been a noticeable reversion in thinking toward 19th-century liberalism, with the consequence that deregulation and privatization have become policies central to much governmental reform. To a marked degree, this shift in policy orientation reflects a response to technologically driven globalization. By lowering the costs of transactions and information, technology has reduced market frictions and provided significant impetus to the process of broadening world markets. Expanding markets, in turn, have both increased competition and rendered many forms of intervention either ineffective or perverse.

The recognition of this prosperity-enhancing sea-change in world markets and, in that context, of the counterproductive consequences of pervasive intervention has led many governments to reduce tariffs and trade barriers and, where necessary, to deregulate markets. These actions themselves have further promoted the very globalization that, interacting with advancing technology, spurred the deregulatory initiatives in the first place. The result of this process has been an advance and diffusion of technical change that has raised living standards in much of the world.

The conceptual battleground has moved far from the stark terms of the earlier capitalist-socialist confrontations. The failed experiment in central planning in Eastern Europe and the Soviet Union after World War II has largely muted the arguments of most ardent socialist planners.
The debate has now shifted to the nature and extent of actions appropriate for governments to take in order to ameliorate some of the less desirable characteristics that are perceived to accompany unfettered competition. But unlike in much of the nineteenth century, little unfettered competition is actually practiced in today’s world. In large part driven by the value standards of our societies that developed out of the Great Depression, some government regulation is practiced virtually everywhere.

Nonetheless, it has become generally understood that governmental actions often hinder incentives to investment by increasing uncertainties, boosting risk premiums, and raising costs. Even among those who deride the more unbridled forms of capitalism, there is a growing awareness that many attempts to tame such regimes are not without cost in terms of economic growth and the average living standards of a nation.

A recent manifestation of these costs can be seen in the lower level of high-tech capital investment in continental Europe, on average, and in Japan, relative to that in the United States. Arguably, this outcome has resulted to an important degree from the particular legal structures and customs that govern labor relations in much of Europe and Asia. By choice over the decades, Europe, for example, has endeavored to protect its workers from some of the presumed harsher aspects of free-market competition. To discourage layoffs, discharging employees was made a difficult and costly process in comparison with that in the United States. By law and by custom, American employers have faced many fewer impediments in recent years to releasing employees.

This difference is important in our new high-tech world because much, if not most, of the rate of return from the newer technologies results from cost reduction, which on a consolidated basis largely means the reduction of labor costs. Consequently, legal restraints on the ability of firms to readily implement such cost reductions lower the prospective rates of return on the newer technologies and, thus, the incentives to apply them. As a result, even though these technologies are available to all, the intensity of their application and the accompanying elevation in the growth of productivity are more clearly evident in the United States and other countries with fewer impediments to implementation. Parenthetically and counterintuitively, the increased ease of layoffs in the United States, by reducing the risks of hiring by American employers, has contributed to a higher rate of employment in the United States compared with the vast majority of our major trading partners.

A particular irony in all this is that Europeans have been finding investments in the United States increasingly attractive and have accounted for an increasing share of the expanding total of foreign investment in U.S. direct and portfolio assets. In an effort to raise returns on domestic assets, many governments, European and others, are being led away from former dirigiste regimes to place greater reliance on markets. These governments see such a direction as necessary in order to enable their firms and workers to achieve the efficiencies required to meet the rigors of international competition. Recent plans for tax reforms, significant initiatives to create more flexible labor markets, and ongoing steps toward greater privatization in Europe and elsewhere underscore the extent to which views have changed in recent years.
But it is clearly pragmatism, not ideology, that is the main driving force in these evolving views. The structural policy adjustments in Western Europe and Japan, not to mention the efforts in China and Russia to move toward market capitalism, are being motivated, for the most part, by the evident ability of market competition to elevate living standards. Thus, despite the meaningfully different views initially held of the way the world does, and should, work, powerful global competitive forces appear for now to be driving the economic and legal paradigms of many nations into closer alignment around a more competitive market capitalism.

It is by no means self-evident, however, that these trends will eventually lead to world convergence of economic regimes and to agreement about the conceptual framework that such a convergence would likely require. Certainly, the demonstrated ability of relatively unfettered markets to raise living standards over time creates considerable incentive for movement in that direction. But the speed of that movement - indeed, its very persistence over time - is far less clear. Even among liberal democracies, one can still find deep-seated antipathy toward free-market competition and its partner, creative destruction, to use Joseph Schumpeter’s now famous insight. While recognizing the efficacy of capitalism to produce wealth, there remains considerable unease among some segments about the way markets distribute that wealth and about the effects of raw competition on the civility of society. Thus, should recent positive trends in economic growth falter, it is quite imaginable that support for market-oriented resource allocation will wane and the latent forces of protectionism and state intervention will begin to reassert themselves in many countries, including the United States.

For now, the process of globalization is being aided by strengthening economic growth, which is clearly being driven by an accelerating application of new insights. Technological innovation, however, arguably comes in bunches as new discoveries feed on one another to push forward innovation until the effects of the initial impetus finally peter out. The vast electrification of our societies and, before that, the spread of the railroads helped elevate economic growth for a considerable period of time. But the pace of growth eventually slowed when full, or near-full, exploitation of the newer technologies was achieved.

The most recent wave of technology has engendered a pronounced rise in American rates of return on high-tech investments, which has led to a stepped-up pace of capital deepening and increased productivity growth. Indeed, it is still difficult to find credible evidence in the United States that the rate of structural productivity growth has stopped increasing. That is, even after stripping out the significant impact on productivity acceleration of the recent shape of the business cycle, the second derivative of output per hour still appears to be positive.

If we knew at what stage of the current technological wave we were in, we could, I assume, confidently project when these elevated rates of change in long-term earnings expectations, productivity growth, and, hence, wealth creation would return to a more historically average pace. For it seems evident that once such a wave begins to crest, much of the self-reinforcing virtuous cycle presumably fades, as it has in the past.
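In discrete terms, a positive “second derivative of output per hour” simply means that productivity growth itself is still rising. A minimal sketch with an invented productivity index (none of these figures come from the speech):

```python
# Hypothetical illustration of checking whether productivity growth is
# accelerating, i.e., whether its discrete "second derivative" is positive.
import math

# Invented annual index of output per hour (levels, not growth rates).
output_per_hour = [100.0, 101.5, 103.4, 105.9, 109.1]

# First difference of the log level approximates the annual growth rate.
growth = [math.log(b / a) for a, b in zip(output_per_hour, output_per_hour[1:])]

# Second difference: the year-to-year change in the growth rate itself.
acceleration = [b - a for a, b in zip(growth, growth[1:])]

print([f"{g:.2%}" for g in growth])       # growth rising: 1.49%, 1.85%, 2.39%, 2.98%
print(all(a > 0 for a in acceleration))   # True -> growth is still accelerating
```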
In such a scenario, full development of available technological synergies and their competitive deployment would damp the historically high prospective rates of return on capital investment and slow the pace of capital deepening. The level of structural productivity does not recede, of course, since once gained new technological insights are never lost, but its rate of increase would slow, and projections of long-term profits growth presumably would regress back to earlier magnitudes.

From any perspective, of course, a tapering-off in productivity acceleration is inevitable at some point in the future. In the past few hundred years for which we have some rough productivity approximations, human ingenuity, even at its best, appears to have rarely produced annual productivity growth approaching double-digit rates for any protracted period of time.

Any notable shortfall in economic performance from the standard set in recent years, as I indicated earlier, runs the risk of reviving sentiment against market-oriented systems even among some conventional establishment policymakers. At present, such a shortfall is not anticipated, and such views are not widespread. But they resonate in some of the arguments against the global trading system that emerged in Washington, DC, and Seattle over the past year. Although most of these arguments may be easy to reject, those of us who support continued endeavors to extend market-driven globalization need to understand and, if possible, address the concerns that give rise to the desire to roll back globalization.

How the convergence of economic systems toward the most market-oriented capitalist structures will fare if world long-term economic growth trends revert to historic norms is an intriguing question. In the meantime, this extraordinary period of technological advance continues to exhibit great vitality, bringing with it the prospect of further globalization, greater competition, and the resulting improvements in the economic welfare of most of the world’s citizens. It is almost surely the case that, the longer the process of globalization of economic activity continues, the more firmly entrenched will be the gains we are beginning to realize. But our past endeavors at long-term forecasting afford us little confidence in being able to anticipate seminal changes in global economics and finance. We cannot, however, refrain from reflection. That is what this conference is all about.
Laurence H Meyer: Why risk management is important for global financial institutions

Speech by Mr Laurence H Meyer, Governor of the Board of Governors of the US Federal Reserve System, before the Bank of Thailand Symposium, Risk Management of Financial Institutions, held in Bangkok, on 31 August 2000.

* * *

I am very pleased to have been invited to address this symposium on the timely and important topic of risk management. Continuing increases in the scale and complexity of financial institutions and in the pace of their financial transactions demand that they employ sophisticated risk management techniques and monitor rapidly changing risk exposures. At the same time, fortunately, advances in information technology have lowered the cost of acquiring, managing and analyzing data, and have enabled considerable and ongoing advances in risk management at leading institutions worldwide. As this symposium illustrates, banks in many emerging market countries are also increasing their focus on risk management in an effort to build more robust and sound financial systems, to remedy weaknesses that were exposed by recent regional problems, and to position themselves to participate more fully in the global economy.

Why risk matters

Because taking risk is an integral part of the banking business, it is not surprising that banks have been practicing risk management ever since there have been banks - the industry could not have survived without it. The only real change is the degree of sophistication now required to reflect the more complex and fast-paced environment.

Even today, however, some simple rules continue to be critical to risk management, and I cite Barings as a negative example. By the simple act of separating front-office from back-office responsibilities, Barings could well have prevented the enormous losses that led to its failure. In addition, Barings’ management never explored how Nick Leeson could have produced such high returns, even though the trading he was authorized to undertake was essentially riskless and thus should not have been so profitable. Had they questioned their “good fortune”, Barings’ management might have uncovered the hidden losses before they became large enough to bankrupt the firm. A third lesson from Barings is that the benefit of risk management is the losses it will prevent, not the additional revenues it will generate.

The Asian financial crisis of 1997 illustrates that ignoring basic risk management can also contribute to economy-wide difficulties. The long period of remarkable economic growth and prosperity in Asia masked weaknesses in risk management at many financial institutions. Many Asian banks did not assess risk or conduct a cash flow analysis before extending a loan, but rather lent on the basis of their relationship with the borrower and the availability of collateral - despite the fact that collateral was often hard to seize in the event of default. The result was that loans - including, I might add, loans by foreign banks - expanded faster than the ability of the borrowers to repay. Additionally, because many banks did not have or did not abide by limits on concentrations of lending to individual firms or business sectors, loans to overextended borrowers were often large relative to bank capital, so that when economic conditions worsened, these banks were weakened the most. The Asian crisis also illustrates the potential benefit of more sophisticated risk management practices.
Many Asian banks did not adequately assess their exposures to exchange rate risk. Although some banks matched their foreign currency liabilities with foreign currency assets, doing so merely transformed exchange rate risk into credit risk, because their foreign currency borrowers did not have assured sources of foreign currency revenues. Similarly, foreign banks underestimated country risk in Asia. In both cases, institutions seemed to have assumed that stability would continue in the region and failed to consider what might happen if that were not the case. A greater willingness and ability of banks to subject their exposures to stress testing could have highlighted the risks and emphasized the importance of key assumptions. Had they conducted stress tests, some lenders might have seen how exposed they were to changes in exchange rates or to an interruption of steady economic growth.

Although avoiding failure is a principal reason for managing risk, global financial institutions also have the broader objective of maximizing their risk-adjusted rate of return on capital, or RAROC. This means not just avoiding excessive risk exposures, but measuring and managing risks relative to returns and to capital. By focusing on risk-adjusted returns on capital, global institutions avoid putting too much emphasis on activities and investments that have high expected returns but equally high or higher risk. This has led to better management decisions and more efficient allocation of capital and other resources. Indeed, bank shareholders and creditors expect to receive an appropriate risk-adjusted rate of return, with the result that banks that do not focus on risk-adjusted returns will not be rewarded by the market.

Risk management is clearly not free. In fact, as I will discuss, it is expensive in both resources and institutional disruption. But the cost of delaying or avoiding proper risk management can be extreme: failure of a bank and possibly failure of a banking system. A point too often overlooked, however, is that, by focusing on risk-adjusted returns, risk management also contributes to the strength and efficiency of the economy. It does so by providing a mechanism that is designed to allocate resources - initially financial resources but ultimately real resources - to their most efficient use. Projects with the highest risk-adjusted expected profitability are the most likely to be financed and to succeed. The result is more rapid economic growth.

I want to emphasize that point. The ultimate gain from risk management is higher economic growth. Without sound risk management, no economy can grow to its potential. Stability and greater economic growth, in turn, lead to greater private saving, greater retention of that saving, greater capital imports and more real investment. All this, from sound risk management. Without it, not only do we lose these gains, but we also incur the considerable costs of bank disruptions and failures that follow from unexpected, undesired and unmanaged risk-taking.

Making risks matter to owners and managers

One might ask why many Asian banks made the mistake of paying little attention to risk. One answer, to which I have already alluded, is that the many years of strong economic performance by Asian economies lulled banks and their supervisors into complacency regarding risk. That is only part of the answer, however. We have learned that certain prerequisites must exist before bank owners and managers will pay attention to risk.
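A minimal sketch of the RAROC comparison described above. The formula shape - expected losses netted out of return, divided by an economic-capital allocation - is the standard textbook one, and every number and name below is an invented assumption, not something from the speech:

```python
# Hypothetical RAROC comparison of two business lines; all inputs are invented.
def raroc(revenue, funding_cost, operating_cost, expected_loss, economic_capital):
    """Risk-adjusted return on capital: risk-adjusted net income divided by
    the economic capital allocated against unexpected losses."""
    return (revenue - funding_cost - operating_cost - expected_loss) / economic_capital

# A high-yield portfolio: a bigger gross return, but bigger expected losses
# and a much larger capital allocation against its risk.
high_yield = raroc(revenue=12.0, funding_cost=5.0, operating_cost=2.0,
                   expected_loss=3.0, economic_capital=16.0)

# A plain corporate book: a thinner margin, but far less capital at risk.
corporate = raroc(revenue=7.5, funding_cost=5.0, operating_cost=1.0,
                  expected_loss=0.5, economic_capital=6.0)

print(f"high-yield RAROC: {high_yield:.1%}")   # 12.5%
print(f"corporate RAROC:  {corporate:.1%}")    # 16.7% - wins despite lower return
```

Under these invented figures the lower-return book is the better use of capital, which is exactly the point made above about not overweighting high-return, high-risk activities.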
At least some of these prerequisites were absent in many Asian countries. One prerequisite is that there be no implicit or explicit government guarantees for bank owners and managers. Their incentives to manage risk and avoid insolvency will be severely blunted if insolvent banks are merged with stronger banks or simply allowed to continue operating without owners losing their stake or managers losing their jobs. Similarly, bank creditors will exert no market discipline if they run no risk of loss on their claims on banks. Thus, guarantees for bank creditors will also blunt the incentive of banks to control risk. Market discipline also requires adequate accounting and disclosure standards, to enable bank investors to judge a bank’s condition accurately.

An important complement to market discipline in promoting sound risk management is effective bank supervision. To be effective, though, bank supervisors must have the ability to assess a bank’s condition, especially the condition of the loan portfolio, and they must have the authority to require adequate provisions for loan losses. After ensuring that loan loss reserves are adequate, supervisors must have the authority to close banks that are insolvent, wiping out owners or shareholders and removing management. Without such authority - and the willingness to use it - a country effectively has a policy of forbearance, even if officially it does not.

Another prerequisite for risk to matter is that there be no government-directed lending, because directed lending carries with it an implicit government guarantee. In Korea, for example, during many years of directed lending to the large chaebols, banks did not develop the skills to assess the risk of these, their largest borrowers. Even after the end of government direction and the implied guarantee, banks still did not insist on receiving full financial information from the chaebols. Consequently, banks could not assess the risk of these loans, even though the risk they faced had become substantial.

Prerequisites for risk management

There are also prerequisites for banks to develop the ability to measure and manage risk effectively. First, in order to measure risk, the country must have solid accounting and disclosure standards that provide accurate, relevant, comprehensive and timely information so that banks can assess the condition and performance of borrowers and counterparties. To ensure accuracy, accounting systems need to be supplemented by auditing systems and backed up by enforceable legal penalties for providing fraudulent or misleading information to government agencies and outsiders. Banks also need reliable information on the credit history of potential borrowers and on macroeconomic and financial variables that can affect credit and other risks. Additionally, banks need a staff with sufficient expertise in risk management to identify and evaluate risk.

Implicit in most methods of evaluating credit risk is the assumption that the probability of repayment depends on the ability of the borrower to repay, in other words, that willingness to repay is not the issue. If repayment depends on whim, then its probability is difficult if not impossible to assess. Thus, an adequate legal system and “credit culture”, in which borrowers are expected to repay and are penalized if they do not, are yet further prerequisites for sound and accurate risk management.
The ability to seize the collateral of borrowers in default is essential if banks are to have the incentives and ability to mitigate risk. Without the legal infrastructure - the laws, courts and impartial judges necessary to enforce financial contracts in a timely manner - much of risk management would be for nothing once the initial decision to extend credit was made. Finally, the potential for conflicts of interest in risk management must be limited. In particular, regulations are needed that restrict and require disclosure of connected lending to bank owners, shareholders or management. Without such regulations, the desire for personal gain may distort the incentives of bank owners and managers to manage risk appropriately.

Sound practices in risk management

At this juncture, it might be useful if I shared with you developments and practices in one country. Of course, I am most familiar with my own. Throughout the past decade, the Federal Reserve has devoted increased attention to understanding the risk management practices of US banks and has redirected its supervisory efforts to focus on management processes and areas of perceived greatest risk. The sheer complexity, volume and pace of transactions in our largest institutions today demand that we evaluate their risks by reviewing the structure and effectiveness of their policies, procedures and controls. While a certain level of “transaction testing” remains important to ensure that internal systems work and that controls are real, we can no longer take comfort in the safety of large, complex banks by independently assessing their overall condition at a specific point in time. Given the financial products on the market today, risk profiles of complex institutions can change too easily and quickly. We need the assurance that the practices that have kept an institution sound so far are likely to keep it sound in the foreseeable future.

US bank managers had earlier come to the same conclusion regarding their own abilities to control individual transactions and positions in their worldwide operations. As a result, they have invested steadily in developing prudent and understandable policies and in improving methods for measuring and managing risks, firm-wide. As their institutions and activities grew, they needed a common measure to compare alternative uses of capital and to evaluate the performance of an expanding range of business lines. The now familiar concept of RAROC helped greatly to fill that need. They also required a better process for maintaining quality results, providing management and employees with proper incentives, and detecting problems at an early stage. Along with their size and complexity, their demand for stronger and more objective methods of risk management increased.

The financial institutions in the vanguard of risk management have tended to be those most active in international capital markets and derivative activities, where participants are most informed, transactions are most efficient and data are most readily available. They were the ones that, most typically, had not only the expertise and resources to develop sophisticated systems for measuring and evaluating risks, but also the need to do so, given the complexity of their transactions and products. They needed to identify the key factors driving market volatility and to quantify the underlying risks in order to manage their positions and product lines. For trading activities, the value-at-risk measure was an important breakthrough.
It gives banks a measure of a portfolio's largest expected loss during a particular time period for a given level of probability. It provided a statistically sound and easily understood basis for managing market risk and, as you know, also served as the foundation for new regulatory capital requirements for internationally active banks. At most banks, though, market risk is relatively small. Measuring and controlling credit risk is typically far more important and, unfortunately, a much more difficult task.

In managing risk, banks must decide which risks to take, which to transfer and which to avoid altogether. Market risks are easily transferred, often through swaps and other derivative products. Unless the institution believes it has a comparative advantage in accepting a particular risk, it is typically sold. During the past decade, the fivefold increase in the notional volume of derivative transactions, to nearly USD 35 trillion for US banks alone, reflects the demand for risk-mitigating products in this area. Certain other risks, such as operating risks and the chance of various idiosyncratic losses, can be reduced through insurance, diversification and internal controls. Accepting credit risk, though, is fundamentally the business of banking and is the activity which most banks see as their principal competitive advantage. Typically, financial market participants have far less information than banks do about the credit quality of individual borrowers, and the terms and conditions of the borrowing arrangements are often complex and structured case-by-case. Credit card loans and certain other retail credits are a notable exception.

In recent years, leading banks have devoted increased attention to measuring credit risk and have made important gains, both by employing innovative and sophisticated risk modeling techniques and also by strengthening their more traditional practices. For example, a popular vendor model measures default risk by applying option theory to the market value of a borrower's equity share price and calculates the probability of a negative net worth. That approach has the important feature of incorporating market assessments of risk into the analysis and is often used to validate a bank's independent view. Other models, including most internal models of banks, take a more direct approach to calculating the fundamental elements of credit risk. They estimate the probability a borrower will default, based on numerous measures of its stated financial strength; the bank's exposure given a default, reflecting any unused commitments of the bank to lend; and the expected loss given default, taking into account any collateral or other loss-mitigating features of the credit agreement. Combined, these measures reveal the expected loss, which a bank must know to underwrite and price a credit correctly, as well as to establish adequate loss reserves. However, it is the volatility of this loss and the contribution of the credit to the volatility of the bank's cash flow on a firm-wide, portfolio basis that is crucial to evaluating capital adequacy.

As you can imagine, the process of modeling credit risk is still as much art as science for most banks, requiring many assumptions and subjective judgments as well as substantial amounts of data. Testing the sensitivity of a model's results to these decisions and maintaining the integrity of the process from start to finish will be a constant challenge.
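To make these two measures concrete, consider the minimal sketch below. It shows a textbook parametric value-at-risk calculation for a trading book and an expected credit loss built from the three elements just described - probability of default, exposure at default and loss given default. The function names, parameter values and the normal-distribution assumption are illustrative inventions, not any bank's or regulator's actual model.

from math import sqrt
from statistics import NormalDist

def value_at_risk(portfolio_value, daily_vol, horizon_days, confidence=0.99):
    # Parametric VaR: the largest loss expected over the horizon at the
    # stated confidence level, assuming normally distributed returns
    # (a deliberate simplification for illustration).
    z = NormalDist().inv_cdf(confidence)
    return portfolio_value * daily_vol * sqrt(horizon_days) * z

def expected_credit_loss(pd, ead, lgd):
    # Expected loss = probability of default x exposure at default
    # x loss given default - the three elements described above.
    return pd * ead * lgd

# Hypothetical figures, for illustration only.
print(value_at_risk(100_000_000, 0.01, 10))         # 10-day, 99% VaR on a USD 100 million book
print(expected_credit_loss(0.02, 5_000_000, 0.45))  # USD 45,000 expected loss on a USD 5 million exposure

The caveat in the speech applies to any such sketch: it is the volatility around that expected loss, on a firm-wide portfolio basis, that drives capital adequacy, and no formula this simple captures it.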
Nevertheless, leading institutions around the world have made substantial progress in measuring risk in recent years, and we expect much more progress in the years ahead as both the regulatory and banking communities devote more attention and resources to the topic. The continually declining cost of technology for storing and analyzing data will help greatly to encourage firms to create the databases necessary to understand credit risk better. This decade may see a watershed for risk management as institutions make far greater use of information. Indeed, I fully anticipate that the revised Basel capital accord now under development will link capital requirements more closely to a bank's own internal evaluation of risk.

For most institutions, though, success in risk management does not require sophisticated models, nor will models alone suffice. Clearly, all banks must take advantage of new technologies and keep pace with market innovations to remain competitive and to survive. That need points to ever more sophisticated risk measurement and management practices. Nevertheless, adhering to the fundamental principles of risk management and adapting sound practices to one's own situation will remain key. As I indicated earlier, the experience throughout the world with bank failures and other financial crises demonstrates time and again that violations of traditional and long-known management principles produce the largest losses. I gave some examples of this earlier. Technology and financial innovation facilitate risk management and accommodate more sophisticated risk exposures. Financial models are tools and the environment around them must still be properly managed and controlled. I cannot overstate that point. The most sophisticated risk management techniques are useless if the operating environment and management incentives are deficient or if fundamental risk management principles are ignored.

Fundamental elements of sound risk management

The fundamental elements of sound risk management are easy to describe in the abstract but are far more difficult to apply case-by-case. Each situation is unique, built around the roles and capabilities of individuals and the structure, activities and objectives of the institution. What works for one firm may, of course, be unsatisfactory for another. Moreover, in the context of a particular firm, the definition of a sound or adequate risk management system is ever changing, as new technology accommodates innovation and better information and as market efficiency grows. To remain competitive, institutions must adapt and constantly improve their process. That fact becomes clearer every day.

Apart from those contingencies, however, certain basics apply quite generally. In any institution, support for crucial programs must come from the top. Each entity's senior management and governing board must set the institution's risk appetite by establishing appropriate policies, limits and standards, and by ensuring that they are followed and enforced. Throughout the institution, risks must then be measured, monitored and reported to key decision-makers. While the complexity and formality may vary widely among institutions, each firm should have clear procedures for assessing risk and evaluating performance over time. There must also be adequate accountability, clear lines of authority and separation of duties between business functions and those involved in risk management and internal control.
These elements, and others, are time-tested fundamentals of risk management that do not entail high technology or complex risk measurement techniques.

Getting there

I very much fear that institutions throughout the world view risk management - often "sophisticated risk management" - as a Holy Grail, without having a real understanding of its fundamental elements and how difficult it can be to impose them. This, I believe, is particularly true in developing countries with traditions or cultures that emphasize relationships more than legally enforceable obligations. Moreover, even for organizations that operate in a culture - such as the United States - that is more risk-oriented, risk management techniques can threaten traditional ways and thoughts. Managers who are already in important decision-making posts and staff who do not yet have the technical skills necessary for risk management are especially likely to find change disruptive, mechanical and lacking in the all-important qualities of judgment and experience that they, of course, already have in considerable abundance.

Do not misunderstand me. I am not an opponent of experience and judgment, and I do believe that building a risk management system that is designed to be automatic is foolhardy. My point is that change is difficult at any institution or set of institutions. It is even more difficult if the change is not consistent with the cultural and institutional environment and/or the skills of the management and staff. Difficulty with staff expertise is why the first steps for effective risk management probably should be the least technical and the ones with the best chance for payoff in the short run. I am thinking of accountability, clear lines of authority and responsibility, and - to avoid conflicts of interest - the crucial separation of business line management from risk management and internal control. As I noted earlier, the latter alone could have saved Barings.

The next step, I suggest, is to begin thinking about returns in terms of the risk-return nexus, the RAROC I noted earlier (a simple numerical illustration follows below). Just that simple way of thinking about an investment decision goes a long way in improving risk management. The best deal - especially for leveraged institutions like banks - is hardly ever the one with the best rate of return. Rates of return are provided to compensate for risk. If rates of return are high, it is because they are compensating for a high level of risk. Risk means variability, and leveraged institutions like banks have little tolerance for loss. That is why banking institutions must reduce and manage risk exposures, think of yields on a risk-adjusted basis, and realize that financial leverage can magnify the impact of losses as well as gains.

A critical concern in developing a basic risk management process involves developing or attracting personnel with the skills necessary to apply risk management tools in meaningful ways. In the 18th and 19th centuries, bankers used to send their sons to work for competitors in financial centers in order to learn the latest techniques and then bring them home. Some of that still goes on - although not so much for relatives any more - with time often spent at the organization's own foreign branch or at a business school rather than at competitors. Expertise was also transferred in the past by local branches of foreign banks training local residents. In the 21st century these are all still excellent ways to import skills in banking in general and risk management in particular.
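To illustrate the risk-return point above with numbers - all of them invented for the purpose - consider two hypothetical loans. Loan B carries the higher gross revenue, but once expected losses and the economic capital at risk are counted, loan A is the better deal. This is only a sketch of the RAROC idea, not any institution's actual methodology.

def raroc(revenue, expected_loss, operating_cost, economic_capital):
    # Risk-adjusted return on capital: net risk-adjusted earnings
    # divided by the economic capital the position puts at risk.
    return (revenue - expected_loss - operating_cost) / economic_capital

# Loan B offers the higher nominal yield, but also higher expected
# losses and a larger capital requirement (hypothetical figures).
loan_a = raroc(revenue=80_000, expected_loss=10_000,
               operating_cost=20_000, economic_capital=250_000)
loan_b = raroc(revenue=120_000, expected_loss=60_000,
               operating_cost=20_000, economic_capital=500_000)
print(f"RAROC of loan A: {loan_a:.1%}")  # 20.0%
print(f"RAROC of loan B: {loan_b:.1%}")  # 8.0%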
Limiting foreign presence may be the worst thing for local banks - insulating them from competition and making the import of valuable human capital and expertise more costly.

Risk management implies significant limits on the ability of highly leveraged financial institutions such as banks to provide badly needed venture capital; it implies that financial systems need more than banks. They need non-bank financial institutions that are less leveraged than banks and have much longer term liabilities. They also need functioning capital markets, including foreign providers of long-term and equity capital. These other elements may constrain local banks, but they bring blessings, too. They create instruments and institutions that are stronger, more diversified and easier to manage during periods of financial stress. They also provide greater stability to financial systems and alternative funding sources for borrowers.

Summing up

Indeed, this may be a lot to load onto such a seemingly small concept as risk management, but the concept is not really so small. It is fundamental to sound banking and requires, I am afraid, a revolution in many of the world's banking systems. Risk exists and banks must accept risk if they are to thrive and meet an economy's needs. But they must manage the risks and recognize them as real. Risk matters. Whether or not it is temporarily ignored, it will eventually come out. Recognizing that fact and dealing with it will benefit lending institutions and the economies in which they operate. Indeed, given globalization, we must all adopt increasingly sophisticated risk management practices in the years ahead.
Roger W Ferguson, Jr: Perspectives on innovation in the retail payments system

Speech by Mr Roger W Ferguson, Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, at a workshop on promoting the use of electronic payments, held at the Federal Reserve Bank of Chicago, Chicago, on 11 October 2000.

* * *

Electronic commerce and finance are growing rapidly. Media announcements of new payments mechanisms designed to aid electronic commerce have become routine. Some recent predictions look for mobile phones and sophisticated wireless devices eventually to become important tools for conducting electronic commerce and payments. As in the 1960s, business and government officials are being asked to predict the future of electronic payments in the United States. This is understandable. Strategic planning and investments will be shaped by views about the future. Yet the future, by definition always unknowable, is hardest to predict when we are in the midst of a wave of innovation and change. At such times public policy also faces special challenges and opportunities.

This morning I would like to offer some thoughts about that earlier period of innovation and change in the banking and payments system that began in the 1960s. I would also like to review briefly our more recent experience, as well as to draw out some lessons for the private sector and public policy. Finally, I would like to provide you with an overview of the recent work of the Federal Reserve's Payments System Development Committee.

Past predictions

We often remember the predictions of a checkless society from the mid-1960s as a lesson in the pitfalls of forecasting the future of electronic payments. Today, as a nation, we write something on the order of 65 billion to 70 billion checks each year, and many "electronic" bill presentment and payment services continue to receive paper invoices and send paper checks.

Looking back, banks and policymakers in the 1960s were grappling with significant problems created by the growth of economic activity relative to our ability to process paper payments and other financial instruments. At one point, the New York Stock Exchange was regularly closed on Wednesdays in order to catch up on paperwork. At the time, there were also fears that contemporary check-processing systems would not be able to handle further large increases in volume as the economy continued to grow. Deep thought and tremendous effort went into solving what came to be called the "paperwork crisis". I recently reread some of the material from the mid-1960s, particularly the work of one of my distinguished predecessors at the Fed, George Mitchell. At the time, computers were increasingly being used to automate business processes. Computer and communications costs were predicted to fall, and automation was being discussed with the same sense of high expectation that we hear today.

At least three things stand out from the discussions of payments and banking in this earlier era. First, a number of the predictions from the mid-1960s about the payments system were in the end remarkably accurate. The fact that cash and checks have not disappeared should not blind us to the fact that real change has taken place. Many of the retail payments innovations in the 1960s, such as credit cards, debit cards and the automated clearing house, are now taken for granted. In the wholesale financial markets, checks and drafts are rarely used and securities are transferred in book-entry form.
Second, some of the analysis of the long-run effects of automation on banking and finance was both insightful and, with hindsight, too conservative. Even in the mid-1960s, it was becoming clear that the combination of computerized banking systems and telecommunications could fundamentally change both business practices and banking regulations. Successive generations of technology, now including the internet, have helped to accelerate the process of change and to create a dynamic financial system.

But third, the early analysis of electronic payments also underestimated the transition costs of the rapid automation and probably overestimated the rate at which computing and communications costs would decline. Changing and integrating infrastructure within businesses and banking organizations and convincing enough players to adopt a technology so that investments will yield a reasonable return have posed many challenges. Even a recent survey by the Association of Financial Professionals showed that the integration of corporate accounting and payments systems still presents a challenge to the greater use of electronic bill presentment and payment. In this complex environment, it is hardly surprising that the overall demand for electronic payments to replace a well-functioning paper-based system has tended to grow more slowly than anticipated.

Recent trends

Thus I suspect that we should be simultaneously optimistic and cautious in our expectations of future retail payment systems, including electronic systems. It is certainly most likely that checks and cash will be with us for a long time. Even though the number of checks written is not measured precisely for the economy as a whole, the number appears to have grown slowly but surely over the past decade - by about 2% per year. Yet over the past 10 years, there has also been a good deal of growth in the use of electronic payments both as a share of non-cash payments and on a per capita basis. For example, we initiate more than 30 billion electronic payments over credit and debit card systems and the ACH. And these retail electronic payments have grown by about 10% per year over the past decade. As a result of these factors, the proportion of checks written compared with the total number of non-cash payments has actually declined from about 80% in 1990 to around 70% in 1999. The share of electronic payments increased by a corresponding amount. This is a significant change for an economy as large and diverse as that of the United States.

On a per capita basis, credit cards are still the most intensively used form of electronic payments for retail transactions, although the use of debit cards has recently been growing at double-digit rates. Furthermore, more than half of workers now receive a direct deposit of their paychecks through the ACH.

As I noted at the outset, the pace of innovation in the retail payments system has once again picked up. There seems to be a continual stream of announcements about new products and projects, as well as new players and shifting alliances. Some of the products are ingenious new ways to make payments over the internet. Others aim to automate older payments mechanisms, such as projects to convert checks to electronic payments at the point of sale. The competition among all the different actors has intensified, as they jockey for competitive position in the marketplace. Therefore, I put these facts together to conclude that today's trends might give a hint of the contours of tomorrow's world.
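Those two growth rates are enough, by themselves, to account for the shift in shares. The sketch below simply compounds a 1990 split of roughly 80/20 at about 2% a year for checks and about 10% a year for electronic payments - round numbers taken from the figures just cited, not official estimates - and lands in the neighborhood of the roughly 70% check share observed for 1999.

# Compound the cited growth rates from a rough 80/20 split in 1990.
checks, electronic = 80.0, 20.0   # indexed non-cash payment volumes, 1990
for year in range(1990, 2000):
    share = checks / (checks + electronic)
    print(year, f"check share: {share:.0%}")
    checks *= 1.02                # checks grow ~2% per year
    electronic *= 1.10            # electronic payments grow ~10% per year
# By 1999 the check share falls to roughly two-thirds of non-cash
# payments, close to the "around 70%" figure cited above.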
Checks, cash, credit cards and the ACH, the established retail payment tools, will all have a place. However, some of the newer electronic payments mechanisms, including internet-based person-to-person and consumer-to-business ("C2B") payments mechanisms, will grow from infancy to greater maturity as well. Each of these payments mechanisms will find a niche, and some will break from the pack into general use.

Challenges for the private sector

The major lesson for the private sector is a challenging one, particularly given its limited resources and its imperative to create shareholder value. I believe that the firms that will succeed in the world of retail payments will have to be prepared to invest simultaneously in modernizing the current retail payment systems - giving them a more electronic and automated backbone - while also experimenting with some of the newer payment tools. It appears that many institutions that thought that the check was going to die off now recognize that they must make strides in improving the security and increasing the automation of this product. However, these same institutions must also make selective investments in the newer and more visionary retail payment mechanisms.

I am not in a position to determine for each private-sector firm how to balance these two goals. Managers and directors of these businesses are closer to these decisions and bear greater direct responsibility for the success of the institutions they guide. However, all firms interested in participating in the payment system will have to recognize explicitly the challenge of maintaining the existing system while building the new one.

Lessons for public policy

What are some of the lessons for public policy that we have learned from our experience with electronic payments since the 1960s? Our general goals of fostering a safe, efficient and accessible payments system have not changed. However, one broad lesson is that in a dynamic economy, markets need to play a key role in guiding the development of infrastructure, including mechanisms like payments systems. This means that innovation and competition will be central to the future development of the payments system - as they are in other areas of the economy. Of course, questions of interoperability between different systems will probably need to be addressed by payments providers. Policymakers for their part should aim to remove barriers to innovation that do not conflict with important public policies and should resist calls to limit competition.

A second and related point is that successes and failures are bound to occur as the ultimate users of payments systems choose among competing options for making payments. The lesson for public policy is that it should not be built on a single product, system or vision of the future, no matter how compelling at the time. Instead, policy should be flexible in a way that allows experimentation and change to take place, particularly in the rapidly changing world of electronic payments.

A third lesson is that public policy should exercise restraint and resist calls for premature regulation. Users face important trade-offs as they make choices about the use of new payment technologies among key attributes such as cost, convenience, safety and complexity. These trade-offs may even shift depending on the specific parties to a payment or its purpose. Regulations typically make implicit assumptions about these important trade-offs, which may pre-empt adjustments by users and providers of the new technologies.
Even well-intentioned regulations can end up addressing the wrong problem or short-circuiting creative innovations. On the other hand, public policy will have to confront genuine and significant problems, when these become clear and are not self-correcting.

A final lesson should temper the thinking of both policymakers and payments system innovators. This lesson involves payments system risks: operational, security, fraud, credit, liquidity and legal risks. Many payment innovations are being built on top of older established systems and infrastructure, while others attempt to circumvent more established payment practices. This is part of the process of innovation. At the same time, innovations need to address risk consistently and responsibly. Relevant information about risk should be provided to the users of payment arrangements. As we know, the failure of private-sector innovators to address risk early may ultimately force public policy to prescribe solutions.

Payments System Development Committee

I would now like to give you an overview of the recent work of the Federal Reserve's Payments System Development Committee, which I co-chair with Cathy Minehan, President of the Boston Fed. The Board created this committee last year to help follow up on the work of my predecessor, Alice Rivlin, and to help stimulate the Federal Reserve System's engagement with the private sector on a range of issues involving payments system innovation. Four important activities of the committee are the following: (1) to identify strategies for enhancing the long-term efficiency of the retail payments system, (2) to identify barriers to innovation and work to address those barriers where possible, (3) to monitor market developments, and (4) to conduct workshops and forums that encourage focused discussions with the private sector. The current areas of concentration by the committee include electronic check truncation and presentment, efforts to reduce legal and regulatory barriers to innovation, standards, and future clearing and settlement systems to support electronic commerce.

In this age of the internet, the committee's work on electronic check collection deserves comment. Checks continue to be the most widely used retail payment instrument after currency. At the same time, it has been very difficult for the banking industry to move from a paper-based to an electronic check collection system. There has been experimentation with check truncation and electronic presentment in the United States since the 1960s, with limited success. Recently, the banking industry has shown renewed interest in this topic. The Banking Industry Technology Secretariat (BITS), for example, has endorsed the goal of having its members present at least 50% of their checks electronically by 2001. The Federal Reserve Banks now present about 20% of their checks electronically to more than 3,800 banking organizations. Both the Federal Reserve and the private sector are piloting new arrangements for truncation, presentment and digital imaging. The issue of how to streamline the electronic return of dishonored checks is also being discussed.

Against this background, the Payments System Development Committee held a workshop at the Federal Reserve Bank of Boston this past June and invited more than 100 public and private-sector experts to help identify barriers to the greater use of truncation and electronic check presentment, along with steps the Fed and the private sector could take to help address these barriers.
The Board released a summary of the Federal Reserve staff analysis of these suggestions early in September, and we will be following up on the suggestions in several areas. One of the promising ideas discussed at the workshop involves a potential reduction in legal barriers to check truncation. The general idea is not only to facilitate the truncation, digital imaging and electronic presentment of checks when this makes economic sense but also to protect the rights of consumers or others to receive a paper check if they want. One means to accomplish this goal, for example, would be to provide a legal foundation that would treat the digital image of a check, or an accurate, machine-readable paper copy of that digital image, as the legal equivalent of the original check. Banking organizations would then have greater flexibility to truncate checks, while allowing banks, other businesses and individuals to receive legally equivalent paper copies of original checks to satisfy business or personal needs. Again, a key feature of this idea is that rights would have to be protected. The Federal Reserve staff has been following up on this idea in discussions with banking, legal, business, consumer and government representatives, and will continue to engage the private sector in dialogue.

The committee also expects to pursue initiatives in the area of technical standards, particularly for exchanging electronic checks and paper substitute checks, as well as to discuss new operational concepts for check imaging and ways to test these concepts. We will also look for ways to work with the private sector to inform depository institutions and the public about electronic check collection.

Some organizations have suggested that it is more often business considerations than technical issues that hold back participation in electronic check initiatives. In other words, some banks do not see a strong "business case" for electronic check collection, and this has clearly been a stumbling block for many organizations over the years. Several groups have already done work to identify costs and benefits. To follow up on suggestions that there may be a need for further work, the committee will seek additional views from the banking industry about the best approach to deal with these issues. Ultimately, however, each financial institution must decide what is best for that organization, and the Federal Reserve can serve as a facilitator for discussions, if needed.

Speaking of the information one might need to create a business case and plan business investment, I feel compelled to note here that we do not really know how many checks are written in the United States each year - information that might be helpful to those interested in automating or replacing check payments. The Federal Reserve is now planning to collect data to help estimate the annual volume of check payments and their value. We are counting on the assistance of the banking industry in this endeavor.

Conclusion

Overall I have a sense that new energy is flowing into efforts to improve the retail payments system. The fact that check, cash and credit cards are likely to be with us for some time should not blind us to the changes that are occurring. I believe that new technology, changes in the banking laws, and old-fashioned competition are producing change. Some believe that we may see revolutionary change.
Several newer retail payment mechanisms will be added to the existing ones, but the history of automating the retail payments system cautions that evolution is more likely than revolution. The Federal Reserve is actively engaged with the private sector in discussing changes in the payments system. We need to be alert to help remove barriers to innovation, including regulatory barriers, when this is in the public interest. At the same time, new payment arrangements need to address traditional payments system risks in a responsible manner and not wait until problems tarnish innovative thinking. Finally, I continue to look for a market-oriented approach to payments system innovation that will provide long-lasting benefits to the consumers and businesses that use the US payments system. True innovations frequently disturb comfortable habits. Thus we need to approach payments system innovations with an open mind and a willingness to learn. This is particularly true in the world of electronic commerce, where payments are being adapted to new technologies, products and methods of doing business. These innovations are important in themselves. But they are also important because successful innovations to support electronic commerce may, over the long term, have a broad influence on the payments systems we use throughout our economy.
Alan Greenspan: Challenges for monetary policymakers

Remarks by Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, at the 18th Annual Monetary Conference, "Monetary Policy in the New Economy", at the Cato Institute, in Washington DC, on 19 October 2000.

* * *

For the past two decades, central bankers have largely been successful at subduing the inflationary pressures that threatened to upend market-oriented economic systems a generation ago. The Federal Reserve, during this period, has squeezed out an inflationary excess of liquidity. In Europe, tolerance for inflation has declined dramatically, and a commitment to price stability - long the hallmark of Bundesbank policy - has gained widespread acceptance.

In the past five years or so, two new trends have greatly assisted this process in the United States. One has been the serendipitous emergence of a once- or twice-in-a-century surge in technology. Without the accompanying boost to productivity, our progress toward price stability might well have been marked by the social pressures that arose in many previous episodes of disinflation both here and abroad. The other trend was the growing political commitment to address an outsized federal budget deficit that was absorbing an inordinate share of our national saving. That change in our political environment - accompanied by the faster pace of technology-driven growth - has resulted in a sharp reduction in the unified budget deficit and, more recently, the emergence of a significant surplus. This, in turn, has helped fill the pool of saving that has fed productivity-enhancing and cost-reducing capital formation.

A number of elements came together to make this economic configuration possible. The essential ingredient was the public support that developed as the growth-inhibiting consequences of the inflation of the 1970s became manifest. Without this public recognition and support, it would have been difficult to establish the stable macroeconomic foundation upon which the private sector has built so productively in recent years. By now, the story of the boom in information technology is well known, and nearly everyone perceives that the resulting more rapid growth of labor productivity is at least partly enduring. Capital deepening has surged during the past seven years, and innovations, synergies and networking effects have boosted significantly the growth of multifactor productivity. With output per hour having accelerated, cost pressures have been patently contained. For the most part, the Federal Reserve generally recognized these changing fundamentals and calibrated American monetary policy accordingly.

Although we have learned much about managing the financial backdrop to accelerating economic activity, it is essential that we not be deluded into believing that we have somehow discovered the Rosetta stone of monetary policy. A failure by policymakers to sufficiently appreciate the inevitable uncertainties that they confront could result in unfortunate consequences for the economy. While we central bankers do not have full knowledge, we have continued to gain insight - albeit slowly - into how market economies function. That learning process has been aided, especially in the United States, by a vast panoply of data and information from both public and private sources. However imprecise, these readings on the economy have helped us steer monetary policy through an inevitably uncertain future.
In practice, it is the joining of ideas and data that drives policy in the face of uncertainty. We seek to array the probabilities of future policy outcomes, attempt to gauge the costs of being wrong in our decisions, and endeavor to choose the policy paths that appear to offer greater benefits and fewer risks. Whether we choose to acknowledge it or not, all policy rests, at least implicitly, on a forecast of a future that we can know only in probabilistic terms. Even monetary policy rules that use recent economic outcomes or money supply growth rates presuppose that the underlying historical structure from which the rules are derived will remain unchanged in the future. But such a forecast is as uncertain as any.

This uncertainty is particularly acute for rules based on money growth. To be sure, inflation is at root a monetary phenomenon. Indeed, it is, by definition, a fall in the value of money relative to the value of goods and services. But as technology continues to revolutionize our financial system, the identification of particular claims as money, near money, or a store of future value has become exceedingly difficult. Although it is surely correct to conclude that an excess of money relative to output is the fundamental source of inflation, what specifically constitutes money is a notion that has, so far, eluded our analysis. We cope with this uncertainty by ensuring that money growth, by any reasonable definition, does not reach outside the limits of perceived prudence. But we have difficulty defining those limits with precision, and within any such limits there remains significant scope for discretion in setting policy. In history, discretionary monetary policy, of course, has not been without its shortcomings, and the dominant force of accelerating productivity has made our current task of policy calibration especially daunting.

Policymakers, in fact all forecasters, invariably construct working hypotheses or models of the way our economies work. Of necessity, these models are a major simplification of the many forces that govern the functioning of our system at any point in time. Obviously, to the extent that these constructs, formal or otherwise, fail to capture critical factors driving economic expansion or contraction, conclusions drawn from their application will be off the mark. In practice, we continuously monitor a substantial quantity of incoming data to test the capability of any specific model or hypothesis to explain actual outcomes. When we experience outcomes that do not square with what our models have projected, we form new hypotheses and reconstruct the models to more closely conform with our observations.

With the virtually unprecedented surge in innovation that we have experienced over the most recent half decade, many of the economic relationships embodied in past models no longer project outcomes that mirror the newer realities. Data series that better measure the workings of the so-called new economy are under development. But we still have far more information on the variety of yarns and weaves produced by textile establishments than data on output of the burgeoning software industry or many of the other rapidly growing high-tech industries. The paucity of data for the latter inhibits our ability to fully test our working hypotheses or models in order to detect changes in economic relationships as quickly and confidently as we would like. Evidence began to accumulate in the early and mid-1990s that prospective rates of return on capital were rising.
This was implicit both in the marked rise in investments in high-tech equipment and in the updrift in estimates of the growth of long-term earnings by corporate management, which were reflected in the projections of securities analysts. Nevertheless, we could not be certain whether what we were observing was a short burst of productivity gains or a more sustained pickup in productivity growth. The view that we were experiencing a sustained pickup gained plausibility when productivity growth continued to increase as the expansion lengthened. But importantly, only after we could see evidence in other economic behaviors and in readings from asset markets that were consistent with accelerating productivity did we begin to develop confidence in our analysis.

When confronted with a period of structural change, our policy actions must be based, in large part, on identifying emerging trends from surprises and anomalies in the data and then carefully drawing their implications. It would be folly to cling to an antiquated model in the face of contradictory information.

Some who question the economic implications of the spread of innovation and the step-up in productivity growth hypothesize that the gains are largely confined to the so-called new economy, with little effect on efficiency in the old economy. But this notion fails to capture the dynamics of the marketplace. To be sure, a significant segment of our economy's growth reflects output of high-tech equipment. Moreover, the long-term prospective profit growth of those firms engaged in the computer and telecommunications industries has been revised up during the past five years by more than double the amount for the so-called old-economy industries. But in a meaningful sense, there is, with few exceptions, little of a truly old economy left. Virtually every part of our economic structure is, to a greater or lesser extent, affected by the newer innovations. No old-economy textile plant could exist in today's environment without technologies that Edmund Cartwright could never have imagined. There are, of course, significant differences in the degree to which the newer technologies have been applied. However, almost all parts of our economy have shared to some extent in the benefits of this wave of innovation.

So it is with some irony that just as we are adapting our old-economy models to the new realities, in drifts an apparition from the past - a spike in oil prices that has potential implications for economic stability and for monetary policy. The reemergence of oil prices as an important macroeconomic consideration is a reminder that there is less of a stark division between old and new economies than is often loosely suggested. Even the oil industry, a presumed old-economy stalwart, is a surprisingly major player in the new. As a consequence, policymakers will need to consider how changes in technologies and world markets may have altered the response of our economies to oil-price shocks and, thus, how best to respond to them. Any evaluation of the current oil-price spike, if indeed that is what it is, will require us to delve into the forces of new technology and to understand how significantly the impact of oil on our economy's workings has changed with a generation of innovation. Largely in response to past oil price increases, the energy intensity of advanced industrial economies has been reduced by half from the levels of the early 1970s.
This, of course, does not mean that the effect of any given oil-price increase is half that of a generation ago, because the size of the effect also depends on the potential to reduce energy consumption as prices rise and on the degree to which other energy sources can be substituted for oil as relative prices change. Most of the oil displacement was accomplished by 1985, within a few years of the peak in the relative price of energy. Progress in reducing energy intensity generally has been far more modest in the last 15 years - not surprising, given the generally lower level of real oil prices after 1985.

What has changed dramatically in recent years is the production side of oil, which is likely to be a substantial factor in determining the extent and duration of any oil-price spike. Oil, in this regard, has truly become at least an associate member of the new economy. The development of seismic techniques and satellite surveillance to discover promising new oil reservoirs has more than doubled the drilling success rate for new-field wildcat wells during the past decade. New techniques and the use of oil rigs that have more computer chips in them than most modern office buildings facilitate far deeper drilling of promising pools, especially offshore. The newer recovery techniques reportedly have raised the proportion of oil reserves eventually brought to the surface from a third to nearly a half in recent decades.

One might expect that, as a consequence of what has been a dramatic change from the old hit-or-miss wildcat oil exploration and development of the past, the cost of developing new fields and, hence, the long-term marginal cost of new oil have declined. And, indeed, these costs have declined, but by less than might otherwise have been the case; much of oil development innovation outside OPEC has been directed at overcoming an increasingly inhospitable and costly exploratory environment, the consequence of more than a century of draining the more immediately accessible sources of crude oil.

One measure of the decline in recent years of the marginal cost of additions to oil availability is the downdrift in the prices of the most distant contracts for future delivery of Light Sweet crude oil. Spot prices have soared and plunged, but for the most distant futures contracts - which cover a time frame long enough to seek, discover, drill and lift oil - prices generally moved lower over the past decade. The most distant futures prices fell from a bit more than $20 per barrel just before the Gulf War to $17 to $18 a barrel a year ago. The current six-year futures contract has risen over the past year to about $21 per barrel. Arguably, however, this rise is related less to technology and the structure of underlying marginal costs and more to concerns about how quickly new practices will be exploited to expand OPEC's productive capacity. With OPEC having increased output significantly during the past year, only a little excess capacity remains. Moreover, going forward, there is concern that OPEC may choose not to expand capacity adequately from its large proven reserves.

The long-term marginal cost of extraction presumably anchors the long-term equilibrium price and, thus, is critical to an evaluation of the magnitude and persistence of any current price disturbance. Over time, spot prices are inexorably drawn back to the long-term equilibrium price, as the balance between underlying supply and demand is restored.
But in the short run, the price of oil, as that of all commodities, inevitably is influenced importantly by inventory levels, especially when stocks become critically short. Over the years, innovation and consolidation have significantly reduced the operating inventories of crude oil and products required to service a given level of product demand. Excluding these operating stocks, the world oil industry has historically run on a rather thin buffer, roughly a 13-day inventory of "usable stocks", according to data from the Energy Intelligence Group. Thus, when OPEC and its allies cut back oil liftings by more than 3 million barrels a day in 1998 and 1999, against the backdrop of expanding world demand, the buffer of more than 15 days of usable oil that prevailed at the time rapidly shrank to a little more than 10 days. As a consequence, the price of crude oil tripled.

OPEC has since more than restored its output, bringing world production now to record levels - levels that exceed reported demand by nearly 1½ million barrels a day, seasonally adjusted. Inventories of usable stocks are building but have yet to regain normal levels. But the remarkable surge in shipments of home heating oil, both here and in Europe, suggests very heavy precautionary oil accumulation by dealers and households not covered in the usual data on inventories for the industry.

It would certainly seem that, with inventories building and the spot price of crude oil well above its long-term equilibrium, spot prices would shortly be under significant downward pressure. However, concerns about the potential for political difficulties to impinge on available supplies persist, as has been so evident over the past few weeks. Even before the recent unfortunate developments in the Middle East, demand to augment buffer stocks surged, which has helped to keep prices high. This owed largely to the possibility of a politically driven removal of a significant part of Iraq's 2½ to 3 million barrels a day from global markets at a time when there exists so little available world excess capacity to replace it.

Even though the intensity of oil consumption is markedly below where it was thirty years ago, it still has the potential to alter the forces governing economic growth in the United States. To date, the spillover from the surge in oil prices has been modest. Any effect on inflation expectations, at least as inferred from the behavior of long-term Treasury Inflation-Indexed Securities, has been virtually nil. Moreover, despite some slowing that likely has been related in part to the bite from the so-called "oil tax" on household incomes, the growth of consumer spending has remained firm. But policymakers will need to be on the alert for oil-driven, indeed energy-driven, risks to our expansion.

Looking further ahead, some of those favorable factors that I discussed earlier - in particular, growing fiscal surpluses and accelerating productivity - remain in place, but presumably will not persist indefinitely. The mounting fiscal surpluses have been an important source of national saving, muting upward pressures on interest rates at a time of strong demand for private credit. By keeping the cost of capital lower than it otherwise would have been, the surpluses have contributed to capital deepening and faster growth of productivity.
But I believe most of us harbor doubts about whether the dynamics of the political process, some of which have been on display in the current budgetary deliberations, will allow the surpluses to continue to grow. Moreover, as I indicated in remarks at Jackson Hole in August, a tapering off of productivity acceleration is inevitable at some point in the future. When that occurs, even should the growth rate of productivity remain high, we could experience less benign readings on cost pressures if the tightness of labor markets persists. That said, as best we can judge, credible evidence that the rate of structural productivity growth has stopped increasing is still lacking. It is the continuing acceleration of productivity that, by allowing businesses to absorb rising compensation increases without incurring ever rising unit costs, has been so essential to containing price increases. We are observing some remarkable structural advances in our economy. Central bankers have learned much about their implications. But, it will be essential for this learning process to continue if we are to recognize and respond effectively to the inevitable surprises placed in our path by a constantly evolving and highly dynamic economy.
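The unit-cost arithmetic behind that closing argument can be made explicit. The sketch below uses the standard approximation that unit labor cost growth is compensation growth less productivity growth; all of the growth rates are invented for illustration and are not figures from the speech.

# Unit labor cost growth ~= compensation growth - productivity growth.
# All growth rates below are hypothetical, for illustration only.
def unit_labor_cost_growth(compensation_growth, productivity_growth):
    return compensation_growth - productivity_growth

# While productivity growth keeps accelerating, faster pay gains are absorbed:
print(unit_labor_cost_growth(0.04, 0.025))  # 0.015: 1.5% unit cost growth
print(unit_labor_cost_growth(0.05, 0.035))  # 0.015: still 1.5%, despite faster pay
# If productivity growth merely stays high while tight labor markets push
# compensation up further, unit costs - and price pressures - begin to build:
print(unit_labor_cost_growth(0.06, 0.035))  # 0.025: 2.5% unit cost growth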
Roger W Ferguson, Jr: Information technology in banking and supervision

Remarks by Roger W Ferguson, Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, at the Financial Services Conference 2000, St. Louis University, St. Louis, Missouri on 20 October 2000.

* * *

I am pleased to speak with you today on technological innovation in the financial services sector. As you may know, technology and its impact have been key areas of focus at the Federal Reserve in recent years. As my colleagues on the Federal Reserve Board have often noted, technological innovation affects not just banking, financial services, and regulatory policy, but also the direction of the economy and its capacity for continued growth.

Some argue that dramatic structural changes are in store for the financial services industry as a result of the Internet revolution; others see a continuation of trends already under way. What is clear is that the last few years have seen a truly phenomenal pace of new technology adoption among even the most conservative banking organizations. A number of financial trade publications are now devoted almost entirely to emerging technologies and the latest financial technology ventures. We know that many banks are making what seem like huge investments in technology to maintain and upgrade their infrastructure, in order not only to provide new electronic information-based services, but also to manage their risk positions and pricing. At the same time, new off-the-shelf electronic services, such as on-line retail banking, are making it possible for very small institutions to take advantage of new technologies at quite reasonable costs. These developments may ultimately change the competitive landscape in financial services in ways that we cannot predict today.

Technology is also changing the supervisory and regulatory landscape. It is creating new tools for supervisors and new supervisory challenges. Technology-driven issues such as privacy and the nature of electronic communications have reached the forefront of the policy agenda. And the line between electronic banking and electronic commerce is becoming more difficult to define clearly. I would like to explore more deeply a few of these issues in my remarks today.

Technology investments

More than most other industries, financial institutions rely on gathering, processing, analyzing, and providing information in order to meet the needs of customers. Given the importance of information in banking, it is not surprising that banks were among the earliest adopters of automated information processing technology. The technological revolution in banking actually began in the 1950s, well before it began in most other industries, when the first automated bookkeeping machines were installed at a few US banks. Automation in banking became common over the following decade as bankers quickly realized that much of their labor-intensive, information-handling processes could be automated on the computer. A second revolution occurred in the 1970s with the advent of electronic payments technology. Recognizing the importance of information security, the financial services industry during the late 1970s and early 1980s was also the first to implement encryption technologies on a widespread basis. The euphoria surrounding the Internet today seems very similar to that era, when the first nationwide credit card and electronic funds transfer systems were built.
As in earlier decades, we can identify three main reasons financial institutions are investing in technology. First, as in the 1950s and 1960s, they anticipate reductions in operating costs through such efficiencies as the streamlining of back-office processing and the elimination of error-prone manual input of data. Second, institutions see opportunities to serve their current customers and attract new customers by offering new products and services as well as enhancing the convenience and value of existing products and services. Third, with more powerful data storage and analysis technologies, institutions are able to develop and implement sophisticated risk- and information-management systems and techniques.

While in hindsight it is clear that many of the earlier investments met those objectives, it is unclear whether today's most highly touted investments have done so, or will do so in the future. For example, the rush to set up Internet banks of a few years ago seems to have slowed, tempered by the experience of the few pioneers in this area, who found that although technology risks and hurdles are surmountable, the basic imperative of making a profit is often not. Smart cards are another example of an innovation that, although widely heralded several years ago as the next new personal banking device, has yet to be proved a convenient substitute for currency and coin.

Overall, the impact of the current technology investment boom in the financial services sector is difficult to assess. We know that productivity in financial services, like productivity in the rest of the service sector, is very hard to measure. The problem is due partly to the difficulty of measuring output accurately when the quality of service is changing as a result of such factors as greater convenience and speed and lower risk. Measuring output in the financial services sector is particularly controversial because so many services, such as deposits, provide services directly to customers and at the same time fund loans. Moreover, measuring the inputs used to produce outputs is difficult. We have not, for example, traditionally required from financial institutions, as part of the supervisory process, any reporting of technology-related investments and expenditures. Lack of consistent data significantly limits systematic industrywide or peer group analysis by supervisors or economic researchers that would shed light on some of these questions.

As I consider the very recent, admittedly mixed, experience of the financial services sector with technologies - looking at the examples of Internet banking, on-line banking, smart cards, and ATMs - it seems that several lessons emerge. First, many of the investments have been made to automate existing processes, but the challenge of fundamentally rethinking the process from start to finish - the so-called core process redesign that is necessary to reap the full benefit of the current generation of technologies - has proved daunting. This is in part because many of the services that banks are attempting to automate currently are "joint goods", that is, the production and consumption of the product or service depend on the inputs or behaviors of many players outside of the bank and even outside of the financial industry. For example, the flow of services from checks depends on a complex of economic actors, including consumers willing to write checks, merchants willing to accept them, and an infrastructure in place to clear and settle them.
Attempting to automate part of the check process by imaging, or to replace checks with a single instrument such as the debit card, requires cooperation among all the organizations that support a checking transaction. Internet banks are another example of these interdependencies. Many Internet banks have discovered that they are using any savings in “brick and mortar” operating costs to pay “bounties”, or fees, to other Internet sites that refer new customers and to operate call centers to field the customer inquiries that invariably arise. Another lesson from the history of technology in banking is that many of the costs in banking are shared across products, and even across customers. Therefore, an investment that might have a positive impact on one customer base or product may not have the desired impact on the overall cost base. I believe that the early history of ATMs illustrates this lesson. The ATM was originally introduced as a way to reduce the costs of the branch network. Although the ATM succeeded in moving small-value withdrawal transactions out of branches, those transactions accounted for only a portion of the customers served and the transactions performed by a branch network. Therefore, early ATM networks added cost without substituting for branch networks. For ATMs to become truly economically attractive, they had to evolve to offer a fuller range of products for a greater proportion of bank customers. Indeed, ATMs now offer more services in more locations, and they have started to make a positive return on investment. The third possible lesson from the history of technology in banking is that banking services may be a class of services for which demand and supply interact so that new supply creates additional demand. Clearly, creating different channels for retail access to banking services, such as branches, PC banking, phone banking, ATMs, and the Internet, has neither significantly reduced the demand for any of those channels nor led to significant bank cost savings. This situation may in part reflect banks’ reluctance to use pricing as an incentive for customers to change their behavior and move to newer technologies. However, it may also reflect the fact that the increased convenience of these different channels is simply translating into a permanent increase in consumer welfare and not necessarily into a permanent increase in revenue or a permanent reduction in costs. In this regard, banks that are not early adopters will admit privately that their investments in new technologies for customer access are largely defensive measures. New channels, such as on-line banking, are not generally leading to increases in the customer base at banks that offer them; instead, customers (particularly the most sophisticated, who have ready access to technology) have begun to expect these services and may readily switch providers if their expectations are not met. Thus, banks have recognized that they need to offer the conveniences of newer technologies merely to retain their existing customers. Federal Reserve research has found an interesting caveat to this observation: banks that either are early adopters of new technologies or are particularly effective at using such technologies do have temporarily higher revenues, though not cost savings. These revenue enhancements are the foundation of the higher profitability, but the elevation of profitability is expected to be temporary.
As others adopt similar technologies, rates of return on new investment fall, and profitability for all banks returns to normal. The net result is an increase in consumer welfare but, as I have just stated, not a long-term reduction in cost or a long-term increase in profitability. The fourth lesson is that the mixed effect of technology in banking more recently may simply reflect the fact that technology can replace relatively simple, repetitive functions, such as the basic calculations and internally oriented back-office support functions that were automated initially. But much of banking still involves higher-level judgments. These are judgments that can be informed by the types of computations performed by computers, but ultimately they cannot be made by computers. Risk management, reserving policy, and underwriting larger commercial and industrial (C&I) loans are, it appears, areas in which technology is an important adjunct to the judgment of experienced managers but ultimately is not a replacement for the experience a banker brings to the undertaking. This is reflected in the fact that risk modeling seems to be further advanced for market risk than for credit risk. We know that investments in newer technologies must be made to modernize existing operations, to face competitive challenges, and to meet customer expectations. Indeed, some of these investments will also be made in the hope of achieving cost savings and other efficiencies. However, I would suggest that bank management needs to enter these investments recognizing that the full benefits may not be gained quickly; may, if gained, be competed away; and may, indeed, not be captured at all. History teaches that costs may emerge long before expected revenues, and that operational risk can either decrease or increase as a result of making major technology investments. As I will emphasize in a moment, bank managers would be wise to monitor carefully the progress of large technology projects, marking major milestones clearly and holding technology management accountable. Given the size, complexity, and business risk of many modern technology investments, these investments clearly should be a top management interest and are a top management responsibility.

Technology in banking supervision

Technology also plays a key role in the Federal Reserve’s longer-term process of modernizing its approach to banking supervision and regulation. I will briefly touch on a few examples. The Federal Reserve and other banking supervisors are reviewing our processes and policies to make sure we are adequately prepared to fulfill our supervisory responsibilities. Part of this task involves better understanding the role and risks of technology in banking organizations. The Year 2000 experience was instructive to many within the supervisory community on the importance of technology to financial business processes. We are attempting to preserve the lessons we learned by integrating technology considerations into our ongoing supervisory process in several ways. For example, traditionally, the Federal Reserve and the other federal banking agencies have conducted separate reviews of information technology operations and assigned these activities separate examination ratings (similar to CAMELS ratings). Earlier this year, we decided to merge these reviews into the mainstream bank supervision process. Like banks, examiners must learn to consider how information technology affects a bank’s financial risks and results, rather than treating it as a separate function.
The privacy provisions of the Gramm-Leach-Bliley Act have also made information security a priority for supervisors. Although we have always reviewed information security as part of the supervisory process, the new law requires us to set consistent expectations for all institutions. Attaining the appropriate balance in assessing technology operations within the supervisory process is not a simple matter, however. For example, over the last year, the number of banks supervised by the Federal Reserve that are offering banking services to their customers over the Internet has more than doubled. As supervisors, we recognize that this kind of sudden change can lead to risks. We are developing training for our examiners on how to review Internet banking operations, and maintaining a sufficiently up-to-date knowledge base will be a constant challenge for supervisors. However, we need to avoid the temptation to view electronic banking and other technology-related operations as a new business line or risk area for which we need to develop a whole new supervisory or risk-management framework. To date, these operations remain a relatively small part of most banks’ operations, and we have not seen them generate heightened supervisory concern. Despite our more integrated view, it is important to recognize that supervisors cannot be responsible for ensuring that the technology employed by financial institutions always works exactly as expected. In fact, there are many technology-related risks and pitfalls that are rightfully the concern of a financial institution’s shareholders, but not necessarily of its supervisors. I do not see bank supervisors hiring legions of network engineers to advise banks on which firewall or encryption technology to use. Even if we felt that a detail-oriented technical approach was warranted, our public-sector resources simply could not support it. Moreover, it is not clear that this type of approach would be consistent with our increased supervisory focus on banks’ risk-management processes and control infrastructure rather than on conducting detailed technical reviews. In fact, most of the technology-related issues we have encountered as supervisors, such as the problems integrating disparate systems that invariably arise in bank mergers, are not the result of inadequate technology but of inadequate planning or project management. The Federal Reserve has reviewed more than a dozen bank holding company applications over the last year or so that have involved Internet ventures. We have found that when supervisory issues were identified, they generally involved managerial or financial concerns rather than concerns about the viability, reliability, or security of the technology. What this suggests to me is that the core risks and core competencies of banking and financial services will change only gradually as banks find new ways of reaching customers through new technology. The international regulatory community has also increased its focus on these issues. As you may know, there is currently a project under way in the Basel Committee on Banking Supervision to evaluate capital requirements for operational and other risks. The process is difficult, both conceptually and in practice. There is little industry consensus about what should or should not be included in the definition of operational risk and about how best to allocate capital.
Operational failures and occasional financial losses are routine events that in many cases can be incorporated into the pricing of services. We do not yet have a good handle on what portion of operational loss events is expected or unexpected. Many larger banks are now developing their own models to measure and estimate operational risk and to allocate capital to cover this risk. The Federal Reserve is currently developing new approaches to assessing operational risk in our supervisory process, but a common view on this topic is probably several years away. Finally, in our day-to-day supervision of banking organizations, US regulators have recognized the need for an ongoing, more risk-focused approach, particularly for large, complex, internationally active banks. We need to stay abreast of the nature of their activities and of their management and control processes. A continuous flow of information helps examiners tailor on-site reviews to the circumstances and activities at each institution, so that our time is well spent understanding the bank’s management process and identifying weaknesses in key systems and controls. Throughout this process, the Federal Reserve is looking at areas in which we can use technology to perform our supervisory responsibilities more effectively. We are implementing a number of automated tools that examiners can use to gather and analyze information in ways that aid the supervisory process without burdening the institutions we supervise.

Legal and regulatory distinctions

Just as the Federal Reserve is modernizing its approach to supervision, so too our banking laws are being modernized. While these laws have historically ensured the separation of banking and commerce in this country, the information-based nature of electronic commerce and its close relation to electronic financial services are challenging this distinction. Traditionally, banks have largely been technology users or buyers. Today, some financial organizations aspire to emulate technology companies, both by participating in the development of new technologies for financial products and services and, potentially, by earning the kind of capital markets support that we have seen for technology companies in recent years. Whether this fit will be harmonious remains to be seen. Nevertheless, it is imperative that we modernize our approach to traditional banking restrictions. The Federal Reserve and state and national chartering authorities have begun to consider the range of electronic commerce activities that financial institutions may operate or own. We recently issued a proposal on “finder” activities that would allow financial holding companies to provide, or invest in, services that bring together buyers and sellers of nonfinancial products. These activities would encompass, for example, hosting an Internet marketplace or operating an Internet auction site. We are considering other areas of technology and electronic commerce activity that would be permissible for financial organizations while still preserving the core distinctions between banking and commerce.

Conclusion

In conclusion, we can expect financial institutions to continue experimenting with new technologies and electronic, information-based services. I believe that this is an area with great potential, yet the uncertainties are large and the payoff horizon is unknown. Banks and supervisors need to recognize that it is acceptable - and even expected - to make some investments that do not pay off.
We also know there have been, and will continue to be, technological glitches - computers and web sites go down occasionally, and e-mail gets lost. The new Internet world is a punishing one for these routine mistakes, and financial institutions have strong incentives to take precautions and to fix problems well before they come to the attention of supervisors and policymakers. The information-based nature of financial services is unlikely to change. I am confident that banks and other financial institutions will continue to find new and better ways to put technology to their own and their customers’ best use, and that they will manage the technology and the business risks associated with these investments.
Laurence H Meyer: The economic outlook and the challenges facing monetary policy Remarks by Laurence H Meyer, a member of the Board of Governors of the US Federal Reserve System, at the Century Club Breakfast Series, Washington University, St. Louis, Missouri on 19 October 2000. * * * When I’m asked what my profession was prior to joining the Board of Governors, I do not say that I was an economic forecaster, but rather a storyteller. As a forecaster, I had learned that neither my students nor my clients wanted to be buried in reams of computer output. They wanted me to tell a story that brought together in a coherent way the implications of the large number of economic indicators they saw as otherwise unconnected and therefore confusing. Since joining the Board, I have tried to continue this approach, focusing in particular on the forces behind the extraordinary macroeconomic performance over the past several years and their implications for the outlook and monetary policy. My story this morning has five chapters on how the economy and monetary policy have adjusted, and must continue to adjust, to the acceleration of productivity and to the oil and other relative-price shocks that have been so important in shaping macroeconomic performance since the mid-1990s. In Chapter 1, I identify the short-run effects of higher productivity growth - specifically, the effects on aggregate demand and on inflation - and assess the implications of these two effects for the conduct of monetary policy. In Chapter 2, I note the favorable choice that confronts policymakers as the economy adjusts to an acceleration in productivity - the choice among temporarily lower unemployment, lower inflation, or some combination - and explain how the response of monetary policy determines this choice. That leads me directly to Chapter 3 and my interpretation of recent monetary policy strategy. In Chapter 4, I discuss a transition to a period of slower growth and possibly higher inflation, a transition that in all likelihood we will make at some point. I interpret this as part of the continuing adjustment to the acceleration in productivity growth and part of the convergence to sustainable utilization and inflation rates. In my final chapter, I turn to how the swings in oil prices have interacted with the acceleration in productivity to shape recent economic performance. The effect of both of these shocks on inflation could reverse, and the balance between them could then turn out to be a powerful influence on the performance of the economy. The story I am sharing with you this morning is mine. I am not speaking for the Federal Open Market Committee or the Board of Governors. In addition, there may be reasonable questions about whether this story should be classified as fiction or nonfiction. Only time will tell. But it is fair to say that the views presented here are not part of the conventional wisdom, in part because a consensus has not yet been reached about the short-run effects of an acceleration in productivity. It will take the benefit of historical perspective and additional data to achieve such a consensus. But policy has to be made in real time. It is in that spirit that I offer these observations.

1. The short-run effects of an acceleration in productivity growth

We do not have many opportunities to learn about how the economy responds to major accelerations or decelerations in productivity. Figure 1 depicts the pattern in trend productivity growth from the 1960s to the first half of 2000.
Trend productivity growth is an estimate of how productivity growth would evolve in the absence of cyclical swings in output and employment. As you can see in Figure 1, trend productivity growth in the early 1970s slowed from a 3 percent rate over the previous decade to about a 1-1/2 percent pace and remained at this lower rate until the mid-1990s. At that point, productivity sharply accelerated. The figure undoubtedly overstates the abruptness of the decline in the early 1970s and the constancy of trend productivity on either side of this point and oversimplifies the pattern of acceleration since the mid-1990s. In Figure 1 the pattern of acceleration is simply a straight line from the earlier period of low productivity growth to an upper-end estimate of trend productivity today. A more sophisticated estimate of the trend is given in Figure A.1, in the appendix. But the important message in both figures is that there have been two dramatic and persistent changes in trend productivity growth during the postwar period - the deceleration in the early 1970s and the acceleration in the mid-1990s. Economists don’t yet agree on how much productivity has accelerated. Estimates among private forecasters of trend productivity growth cluster in a range of 2-1/2 percent to 2-3/4 percent, with the upper end in the neighborhood of 3 percent to 3-1/2 percent. The upper-end estimate would be consistent with an increase in potential output growth to about 4 percent to 4-1/2 percent, compared with potential output growth of about 2-1/2 percent prior to this acceleration. The principal sources of the recent acceleration in productivity appear to be the information technology revolution, the related more rapid decline in the relative price of computers and communication equipment, and the resulting surge in capital spending that has raised the growth of capital services relative to that of labor.

Figure 1. Trend productivity growth

Because we have little experience with major and persistent decelerations or accelerations in trend productivity, economists have not arrived at any consensus on how the economy responds to them. In the case of a sustained increase in productivity growth, it is obvious that the long-run effects include a higher sustainable rate of growth in output. Economic theory also suggests that, all else equal, the equilibrium real interest rate will increase. But there is less agreement about the shorter-run effects of an acceleration in productivity. The events of the past few years have provided an opportunity to learn more about them. This experience, in my view, points to two key short-run macroeconomic consequences of an acceleration in productivity: the demand effect and the direct disinflationary effect. I should note that both of these effects remain open, empirical questions. (Reflecting the absence of a consensus about the short-run effects of an acceleration in productivity, most macroeconomic models do not fully incorporate either of these effects. The FRB-US model used for policy analysis at the Board, however, explicitly incorporates both of them.) Although these influences may be temporary, they can dominate the economic outlook for a number of years following an acceleration in productivity. While the persistent effect of an acceleration in productivity is a higher rate of sustainable growth, the short-run effects appear to have added a temporary bonus that, in my view, has contributed importantly to the truly exceptional performance over the last several years. Specifically, they added to the increase in growth in real activity and were the source of simultaneous declines in unemployment and core inflation during much of this period. These developments have created a powerful set of crosscurrents that challenge monetary policymakers.
The demand effect refers to the stimulus to aggregate demand from the forces that underlie the acceleration in productivity. The information technology revolution, for example, has set off an investment boom. The boom was triggered by the profitable opportunities associated with exploiting the new technologies, by the associated declines in the relative prices of high-tech equipment, and by the decline in the cost of financing high-tech investments as a result of higher equity prices. The information technology revolution also set off a consumption boom triggered by the wealth effect associated with higher equity prices and by a projected higher path of labor income. Recent experience suggests that this surge in demand may, at least for a while, overwhelm the increase in potential output growth, resulting in a steady decline in the unemployment rate. At the same time, an acceleration in productivity also appears to have a direct disinflationary effect. This arises, in my view, because of an asymmetry in the response of wages and prices to the acceleration in productivity - perhaps because salaries and wages are typically adjusted only once a year, or perhaps because of wage norms that cause employees to expect compensation growth similar to the experience over previous years. Whatever the reason, nominal wages initially do not appear to respond very much to the acceleration in productivity. As a result, the higher productivity lowers the cost of production. This has the immediate effect of raising profits, but competition soon puts downward pressure on prices, lowering inflation. The lower inflation, in turn, restrains nominal wage demands, contributing for a time to a virtuous cycle of lower wage change and price inflation. In the appendix, I present a simple model that incorporates this asymmetric response.
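To make the mechanics concrete, consider a minimal numerical sketch of this asymmetry. The growth rates below are illustrative assumptions, not estimates:

# Illustrative arithmetic for the direct disinflationary effect.
# All growth rates are assumptions, in percent per year.
wage_growth = 4.0      # nominal compensation growth, slow to adjust
trend_q_old = 1.5      # trend productivity growth, pre-acceleration
trend_q_new = 3.5      # trend productivity growth, post-acceleration

# Unit labor cost growth = wage growth - productivity growth.
ulc_old = wage_growth - trend_q_old   # 2.5 percent
ulc_new = wage_growth - trend_q_new   # 0.5 percent

print(f"Unit labor cost growth falls from {ulc_old} to {ulc_new} percent")
print("until nominal wages catch up to the faster productivity growth.")

With wage growth temporarily fixed, the two-percentage-point acceleration in productivity lowers the growth of unit labor costs point for point, which is the channel through which competition then pulls down price inflation.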
The demand and direct disinflationary effects have conflicting implications for inflation. The above-trend growth associated with the demand effect lowers the unemployment rate, and the tighter labor market puts upward pressure on inflation. The direct disinflationary effect, on the other hand, lowers inflation for any given unemployment rate. The net effect could be steady, rising, or falling inflation - depending on whether productivity continues to accelerate and on how low the unemployment rate is driven in the process. And, of course, the appropriate course of monetary policy depends on the balance between these two effects. (There are other complications as well. For example, the acceleration in productivity has contributed to the swing from deficit to surplus in the federal budget and appears to have encouraged an appreciation of the dollar. These effects in turn influenced the overall response of demand and inflation to the acceleration in productivity.) The interaction of these two short-run effects can also be explained in terms of the relationship between the unemployment rate and the nonaccelerating inflation rate of unemployment (NAIRU). The NAIRU is the unemployment rate consistent with steady inflation. In computing the NAIRU, the customary practice is to abstract from shocks that directly affect inflation over and above the influence of demand pressure - for example, accelerations or decelerations in trend productivity and swings in the relative price of oil are not taken into consideration. In fact, if there were no shocks, the direction of inflation ultimately would be uniquely determined by the relationship between the unemployment rate and the NAIRU. The NAIRU constructed in this way is best thought of as a long-run value that is relevant once the economy has fully adjusted to any shocks or in the absence of such shocks. But, of course, additional shocks almost always directly affect inflation, most often on the supply side of the economy. One must therefore take into account both the demand pressures captured by the relationship between the unemployment rate and the NAIRU and the supply shocks. Alternatively, one could derive a measure of the NAIRU that takes the supply shocks into account. By its nature, such an adjusted NAIRU would be more of a short-run concept. Because an acceleration in productivity initially lowers inflation for any given unemployment rate, it also lowers the unemployment rate consistent with steady inflation in the near term. That is, the direct disinflationary effect lowers the short-run NAIRU relative to its long-run value. Whether inflation will rise or fall in the immediate aftermath of an acceleration in productivity therefore cannot be judged by comparing the actual unemployment rate to the estimate of the long-run NAIRU. The relevant comparison is between the current unemployment rate and an estimate of the short-run NAIRU that takes into account the disinflationary effect of the productivity shock. My preference is to estimate a short-run NAIRU that directly takes into account the disinflationary effect of an acceleration in productivity, because the effects of an acceleration in productivity on inflation may persist long enough that the adjusted NAIRU might be useful in policy decisions. On the other hand, I would leave out of such an estimation the effect of swings in oil prices, for example, because their effect on inflation dissipates more quickly. The demand and direct disinflationary effects are both temporary, but the period over which each has its influence depends on different considerations. The persistence of the demand effect depends on how long it takes for business capital stocks to adjust to higher expected profitability, for consumer durables to adjust to higher wealth and higher projected future income, and for market interest rates to close the gap relative to the higher equilibrium real interest rate. The direct disinflationary effect of an acceleration in productivity is, in my view, temporary because it likely arises from the lag in the adjustment of nominal wages to the productivity acceleration. It dissipates gradually once productivity growth stabilizes and nominal wages catch up to the productivity acceleration. How rapidly this effect dissipates depends on how quickly productivity growth stabilizes and how rapidly nominal wage gains adjust to the higher productivity growth. A similar sluggish adjustment of nominal wages to productivity developments following the productivity slowdown in the early 1970s may have contributed to the sharp increase in inflation thereafter. (Indeed, the basic framework I am using to explain the response of the short-run NAIRU to a productivity acceleration was first suggested by Steve Braun in 1984, in a paper that sought, in part, to explain the contribution of the productivity slowdown in the early 1970s to the subsequent rise in inflation.) Let me try to make a rough estimate of the direct disinflationary effect of the acceleration in productivity and the associated decline in the short-run NAIRU. The first step is to estimate the acceleration in trend productivity.
If productivity growth has increased from 1-1/2 percent during the period from the early 1970s to the mid-1990s to the upper end of the estimate today - say 3-1/2 percent - then the total acceleration in productivity would be 2 percentage points. If nominal wages do not immediately respond, the growth in labor costs will fall by precisely the same amount as the increase in the growth of labor productivity. In this case, the direct disinflationary effect simply equals the acceleration in productivity. However, the magnitude of the direct effect will diminish over time as wage growth catches up to the faster productivity growth - unless, of course, productivity continues to accelerate, as has been the case over the last several years. Because wages have likely already partially adjusted to the cumulative increase in productivity growth during the last several years, the portion of the acceleration still putting downward pressure on inflation may be somewhat less than 2 percentage points. A simple way of judging the magnitude of the direct disinflationary effect is to calculate the difference between the current estimate of trend productivity and a moving average of this trend. The length of the moving average should be an estimate of the time it takes for nominal wages to fully respond to higher productivity growth. In Figure 2, I plot a measure of the direct disinflationary effect using a forty-quarter moving average of trend productivity. A ten-year moving average probably provides an upper end of the estimate of the direct disinflationary effect. As you can see, this upper-end estimate is that the cumulative productivity acceleration is lowering inflation today by 1-1/2 percentage points, for any given unemployment rate.

Figure 2. Acceleration in trend productivity

Note that when this effect is positive, as it has been since the mid-1990s, it is a disinflationary effect, and when it is negative - as it was following the decline in trend productivity growth after the early 1970s - it is an inflationary effect. The power and persistence of the direct disinflationary effect depends on both the pattern of the productivity acceleration and the speed of the adjustment process - specifically, how fast wages respond directly to the productivity acceleration. If the adjustment is very drawn out, as shown in Figure 2, then the direct disinflationary effect would have been quite large, and this effect would continue to be important for some time, even once productivity growth stabilizes. But if the adjustment were more rapid, the disinflationary effects of the acceleration in productivity would have been less powerful and would dissipate more rapidly once productivity growth stabilizes. As you can imagine, with so few observations, the data do not speak loudly about the speed or length of this adjustment process. As a consequence of this uncertainty about the speed of adjustment, I use a range of estimates - a ten-year moving average to capture the case of very sluggish adjustment and a three-year moving average to illustrate a more rapid adjustment. I view this as a plausible range, but as I noted, it is difficult to identify the precise speed of adjustment within this range. These alternative moving averages translate into a range of 1/2 to 1-1/2 percentage points for the direct disinflationary effect of the acceleration in productivity growth.
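For readers who want to reproduce this kind of calculation, here is a minimal sketch in Python. The trend productivity series is a stylized stand-in (flat at 1-1/2 percent, then ramping to 3-1/2 percent), not actual data:

import numpy as np
import pandas as pd

dates = pd.period_range("1985Q1", "2000Q2", freq="Q")
n = len(dates)
k = (dates >= pd.Period("1995Q1", freq="Q")).sum()   # quarters of acceleration
q_vals = np.concatenate([np.full(n - k, 1.5),        # 1.5 percent through 1994
                         np.linspace(1.5, 3.5, k)])  # ramp to 3.5 by mid-2000
q = pd.Series(q_vals, index=dates)

# q* as trailing moving averages of trend growth: 12 quarters for the
# three-year (fast) adjustment, 40 quarters for the ten-year (slow) case.
q_star_12 = q.rolling(12, min_periods=1).mean()
q_star_40 = q.rolling(40, min_periods=1).mean()

# The direct disinflationary effect is q - q*; under this stylized path it
# ends near 1/2 and 1-1/2 percentage points, roughly as in the text.
effect = pd.DataFrame({"12-quarter q*": q - q_star_12,
                       "40-quarter q*": q - q_star_40})
print(effect.tail(4).round(2))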
I use this range of estimates to calibrate the range for the decline in the short-run NAIRU in Figure 3.

Figure 3. The unemployment rate and alternative estimates of the NAIRU

In Figure 3, I have assumed that the long-run NAIRU is 5-1/2 percent and that it has varied over the period from 1960 until today only in response to the effect of demographic changes on the average unemployment rate. (There is, to be sure, also uncertainty surrounding the estimate of the long-run NAIRU. I continue to find the evidence consistent with a relatively stable value for the long-run NAIRU - once adjusting for demographic changes - and expect that much of the disagreement about the NAIRU is really about the implications of the acceleration in productivity for the estimate of the short-run NAIRU.) As I explain in somewhat more detail in the appendix, I estimate the short-run NAIRU by subtracting the direct disinflationary effect of changes in trend productivity from the estimate of the long-run NAIRU. This yields what I believe is a plausible range for the short-run NAIRU today of 4 percent to 5 percent. This has the convenient property that the bottom end of the range is consistent with the prevailing unemployment rate and would support the view that we might be approaching a type of soft landing - a convergence of growth to trend at a point when the unemployment rate is already at the short-run NAIRU. In this case, there would not be any immediate upward pressure on core inflation. The range also has the property that its mid-point value of the short-run NAIRU is above the prevailing unemployment rate, implying some upward pressure on core inflation over the near term.
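The arithmetic behind this range is simple enough to verify directly; this sketch just plugs in the values stated in the text, with the coefficient on q - q* taken to be unity as in the appendix:

# Short-run NAIRU = long-run NAIRU - (1/b)(q - q*), with (1 - m)/b
# assumed equal to unity, as discussed in the appendix.
long_run_nairu = 5.5               # percent, assumed in Figure 3
effects = (0.5, 1.5)               # q - q* under 12- and 40-quarter q*

upper, lower = (long_run_nairu - e for e in effects)
print(f"Short-run NAIRU range: {lower} to {upper} percent;"
      f" mid-point {(lower + upper) / 2} percent")
# -> 4.0 to 5.0 percent, with a mid-point of 4.5 percent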
2. The favorable policy choice as the economy adjusts to an acceleration in productivity

Policymakers face a choice when confronted with an acceleration in productivity - a choice among very favorable outcomes. They can take the benefits of an acceleration in productivity in temporarily higher output (that is, a temporarily lower unemployment rate) or in lower inflation or in some combination of temporarily lower unemployment and lower inflation. There is no “right” choice here. It depends on policymakers’ preferences, as well as on where inflation is relative to policymakers’ long-run inflation goal at the outset of the acceleration in productivity. The outcome also likely will depend on how quickly policymakers realize that productivity has accelerated and learn how the economy responds to this development. The response of monetary policy to the acceleration in productivity determines this choice. How have the benefits of the acceleration in productivity been taken in this episode? To assess the outcome, it is most useful to look at the pattern of core measures of consumer price inflation - specifically, measures of inflation based on the core PCE and the core CPI - during this period. I interpret the evidence as suggesting that we have taken a large part of the benefits of the acceleration in productivity in temporarily higher output and a smaller portion in the form of lower inflation. But let me emphasize again that the outcome of this choice is not simply a reflection of policymakers’ preferences but also of how quickly the policymakers came to understand the size of the productivity acceleration and how it was affecting the choices they face.

3. The monetary policy response: the two-step

There is considerable uncertainty today about the NAIRU. This uncertainty is usually expressed without regard to the distinction between the short-run and long-run values of the NAIRU. While there is, to be sure, uncertainty about the long-run NAIRU, I expect that much of the uncertainty today follows from the effect of an acceleration of productivity on the short-run NAIRU. At the risk of some oversimplification, recent monetary policy could be viewed as part of a two-step strategy that takes into account the uncertainty about the NAIRU. The first step is to slow the growth in real output to trend to stabilize the unemployment rate at the lower end of the range of estimates of the short-run NAIRU. Thereafter comes step two, which is to apply a more reactive and less preemptive monetary policy. That is, policy is then focused on testing whether or not the prevailing unemployment rate is sustainable with steady inflation. If not, then monetary policy would respond to higher inflation by raising real interest rates. The difficulty of implementing this strategy is increased by the possibility that the short-run NAIRU is a moving target. That is, even if the unemployment rate today is consistent with the short-run NAIRU, the short-run NAIRU may have to rise over time as it converges toward the long-run NAIRU, at least once productivity growth stabilizes (or increases in productivity growth slow by a sufficient amount).

4. A transition to below-trend growth and rising inflation?

From the end of 1995 until mid-2000, growth had been above trend and, as a result, the unemployment rate had been declining. For much of the period, core inflation was declining as well. I believe the economy will ultimately be confronted by a transition - and we may already be in this transition - specifically, to slower growth and perhaps also higher core inflation. The consensus private-sector forecast is for the growth in real GDP to slip below trend in the third quarter, then post a modest rebound in the fourth quarter, and continue in 2001 at a pace that will keep the unemployment rate nearly steady. The consensus forecast also appears to be consistent with relatively stable core inflation, implying that the prevailing unemployment rate may be sustainable and that monetary policy may have succeeded in achieving a soft landing. I hope this will be the case. If so, monetary policy would have achieved a soft landing for the second time in the same expansion - a truly extraordinary feat. Even if growth in real GDP remains at or modestly below trend for a period, there are, in my view, two potential sources of upward pressure on inflation over the next few years. The first is the possibility that the unemployment rate today is below the short-run NAIRU. The central tendency for my estimate of the short-run NAIRU, for example, is above the prevailing unemployment rate, suggesting the potential for some upward creep in core inflation. The second source of higher inflation would be the gradual convergence of the short-run NAIRU to the long-run NAIRU should productivity growth stabilize and the direct disinflationary effect wane. Unless the actual unemployment rate rises as this convergence progresses, the rise in the short-run NAIRU would result in further upward pressure on inflation. An important uncertainty in the forecast is the sustainability of the current rate of trend productivity growth.
For example, in the near term, the robust pace of capital spending could yield still higher productivity growth - through further increases in the ratio of capital services to labor, one of the principal sources of higher productivity growth. Over a still longer period - and this may be several years to a decade or longer - it is quite possible that productivity growth, after reaching a peak, will then diminish. If the acceleration in productivity reflects the bunching of technological innovations, the completion of their spread will signal that productivity has moved to its new, higher level, and the growth in productivity may then diminish to a rate more consistent with its long-run historical average. Such a deceleration in productivity would bring with it the reverse of the favorable conditions that initially accompanied the acceleration in productivity - a choice between a higher unemployment rate and higher inflation, and likely a combination of the two. Even if productivity growth stabilizes at its current rate, we are, in my view, facing a transition. This, of course, presumes that my story about the short-run effects of an acceleration in productivity is on the mark. Initially, we faced a choice between temporarily lower unemployment and lower inflation, and experienced a combination of above-trend growth, a declining unemployment rate, and falling core inflation. The choices may now become less favorable - specifically, some combination of slower growth and perhaps higher inflation. Of course, because of the acceleration in productivity, such a slowdown may still leave the growth rate of real GDP well above the average that prevailed over the two or more decades preceding the acceleration. If growth turns out to be close to trend for a while, the unemployment rate will stabilize at its prevailing value and core inflation will rise to the extent that the unemployment rate is below the short-run NAIRU. Inflation will rise further over time to the degree that the NAIRU moves toward its long-run value. The most benign outcome in this case might be a period of below-trend growth that would gradually reestablish a sustainable unemployment rate, accompanied by a more modest increase in inflation. Given the uncertainty about this analysis - especially about the values of the short-run and long-run NAIRU - it is difficult to design a preemptive policy aimed at foreclosing the risk of higher inflation. Monetary policy will, however, need to watch for signs of an upward creep in inflation that would be part of the transition that I have been discussing. It is important to recognize, however, that there are options between a preemptive response based on the relationship between the unemployment rate and the NAIRU and a totally reactive approach of responding only to higher inflation itself. For example, increases in unit labor costs or decreases in profit margins can be precursors of inflation. So it is important to monitor such developments.

5. The interaction of oil price swings and the productivity acceleration

During the past several years, inflation performance has also been importantly affected by a series of relative-price shocks - particularly swings in oil prices and in non-oil import prices. These shocks have interacted with the disinflationary effect of the acceleration in productivity.
For example, in 1997 and 1998, the decline in both oil prices and non-oil import prices significantly reinforced the direct disinflationary effect of the acceleration in productivity, contributing to a decline in overall CPI inflation to just 1.6 percent in 1998. During most of this period, the unemployment rate, though falling, may have remained above the short-run NAIRU - as depicted in Figure 3 - because the short-run NAIRU was declining as a result of the acceleration in productivity. Beginning in 1999, the direct effect of the rebound in oil prices increased overall inflation, and over the past year, the secondary effects of the rise in oil prices may have boosted core inflation, perhaps reinforcing the effect of an unemployment rate now below the short-run NAIRU. The full effect of the recent rise in oil prices may still be feeding through to the prices of a broader range of goods and services, contributing to a near-term risk of higher inflation - even more so, of course, if oil prices move higher. Going forward, the interaction of these two effects is likely to remain important in shaping the inflation outcome. There is, in my view, a reasonable prospect that each of these two effects will reverse its contribution to inflation over the next couple of years and that the balance between them will be important in determining the pattern of core and overall inflation rates. I noted above that we might be nearing, or possibly already be in, a transition that might include a period of upward pressure on core inflation. If the expectations in futures prices for oil prove correct, however, we may soon be treated to an extended period of decline in oil prices. To be sure, there is enormous uncertainty surrounding such a forecast. In addition, the near-term risks through the winter appear asymmetric to the upside. But, thereafter, the fundamentals - some moderation in the robust pace of global growth and increased investment in oil-producing capacity - point to a gradual but potentially extended decline in the price of oil. A projected drop in oil prices likely underpins the consensus forecast of a decline in overall inflation next year. Going forward, the secondary effects of lower oil prices would help to mitigate the rise in core inflation associated with prevailing and emerging demand pressures. Ultimately, we would still have to deal with the persistent demand pressures on core inflation if the prevailing unemployment rate is below the short-run NAIRU and if the short-run NAIRU begins to converge toward its long-run value. But such a reversal in oil prices would buy some time in addressing this risk. As a result of this balance of forces, we could still achieve a quite benign outcome even if it isn’t the soft landing that many anticipate. No doubt there will be surprises along the way. But the interaction between the continuing adjustment to the acceleration in productivity and further swings in oil prices will likely play an important role in shaping the macroeconomic outcomes and the challenges facing monetary policy over the next couple of years.

Appendix: Productivity and the NAIRU

The model developed in this appendix supports both the analysis in the paper and the estimates of the magnitude of the disinflationary effect of an acceleration in productivity and of the short-run NAIRU. The model is based on a 1984 Board staff paper by Steve Braun, which examined the effect of the productivity deceleration in the early 1970s on inflation afterward.
The insights developed by Braun were integrated into the modeling of inflation dynamics in the Board’s large-scale quarterly econometric model. To focus on the implications of the productivity acceleration, I have not incorporated relative-price shocks into the model, though it would be straightforward to do so. Equations 1 to 3 set out a simple model of inflation dynamics that assumes that productivity affects wages and prices symmetrically. In this case, an acceleration in productivity has no effect on the relationship between inflation and the unemployment rate.

(1) w = a + q - b U + pe
(2) p = w - q
(3) p = a - b U + pe

where
w = rate of increase in nominal labor compensation
p = inflation
pe = expected inflation
q = trend productivity growth
q* = moving average of trend productivity growth
U = unemployment rate

Equation 1 is a wage-price specification of the Phillips curve. The rate of increase in nominal labor compensation (w) depends on the rate of unemployment (U) and expected inflation (pe). The price level is set as a markup over standardized productivity (the level of productivity adjusted for cyclical effects). Equation 2 is a growth-rate version of the markup equation, assuming a constant markup; the inflation rate (p) equals the rate of growth in labor compensation less the trend growth rate of productivity (q). This two-equation model of wage-price dynamics can be solved for the inflation rate by substituting equation 1 into equation 2, yielding the price-price specification of the Phillips curve, equation 3. Here, inflation depends on the unemployment rate and expected inflation. Because the productivity term, q, enters symmetrically in equations 1 and 2, it does not appear in equation 3. That is, the inflation rate in this model is unaffected by the growth rate of productivity or any change in the growth rate. Equation 3 can be solved for the value of the NAIRU (U*) by setting p = pe and solving for U:

(4) U* = a/b

The resulting expression, given in equation 4, is the level of the unemployment rate consistent with any steady rate of inflation, once inflation expectations have converged to this steady rate. In this specification, there is no distinction between short-run and long-run NAIRUs.
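The algebra of equation 4 can be checked symbolically. A quick sketch using sympy, with symbols following the notation above:

import sympy as sp

a, b, U, pe = sp.symbols("a b U pe", positive=True)

# Equation 3: p = a - b*U + pe.  Steady inflation means p = pe;
# solving for U reproduces equation 4, the NAIRU: U* = a/b.
p = a - b * U + pe
print(sp.solve(sp.Eq(p, pe), U))   # [a/b]
# Trend productivity q does not appear in equation 3 at all, so in the
# symmetric model the NAIRU is independent of productivity growth.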
Equation 1’ presents the key modification of this simple model that causes an acceleration in productivity to have an effect on inflation. In equation 1’, the rate of increase in labor compensation now depends on a moving average of the trend rate of growth in labor productivity (q*) rather than on trend productivity growth itself. The key assumption here is that a change in trend productivity growth affects wage change more slowly than price change - that is, a productivity acceleration has an asymmetric effect on wages and prices. This is modeled by assuming that wage change depends on q*, while the inflation equation (based on the markup equation) depends on q.

(1’) w = a + q* - b U + pe

The implications of this modification can best be seen by substituting equation 1’ into equation 2 and solving for the revised specification of the price-price Phillips curve, equation 3’. The inflation rate now depends on the difference between the level of trend growth and its moving average, q - q*. Whenever trend productivity growth increases, q is greater than q* for a while, and inflation is reduced. Once productivity growth stabilizes, q* ultimately converges to q.

(3’) p = a - [q - q*] - b U + pe

The value of the NAIRU is usually derived by assuming that p = pe (as we did in deriving equation 4) and by setting all shock terms to zero; in this case, that means setting q = q*. We will refer to this specification of the NAIRU as its long-run value, relevant when the economy has fully adjusted to any shocks or when there are no shocks. The long-run NAIRU in this case is exactly the same as in the simple model, derived in equation 4. It is also useful to derive a short-run or effective NAIRU, allowing for the effect of an acceleration of productivity on the level of the unemployment rate consistent with steady inflation. The expression for the short-run or effective NAIRU (U**) is derived in equation 4’. If there is an acceleration in productivity, q will exceed q* for a while, and the short-run NAIRU will fall below the long-run NAIRU. That is, the disinflationary effect of an acceleration in productivity allows the economy to operate at higher utilization rates (a lower unemployment rate) before encountering upward pressure on the inflation rate.

(4’) U** = U* - (1/b) [q - q*]

However, once productivity growth stabilizes at a higher level, q* will eventually catch up to q, and the disinflationary effect will gradually diminish and then completely disappear. During the period of adjustment of q* to q, the short-run NAIRU will rise and ultimately converge to the long-run NAIRU. During this transition, inflation pressure will build if the unemployment rate remains unchanged.
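A small simulation makes the transition dynamics of equations 3’ and 4’ visible. This sketch uses adaptive expectations (pe equal to last period’s inflation) and illustrative parameter values, not estimates:

import numpy as np

# Annual steps; a and b are illustrative and imply U* = a/b = 5.5 percent.
a, b = 2.75, 0.5
T, window = 25, 10               # years; ten-year moving average for q*
q = np.where(np.arange(T) < 5, 1.5, 3.5)   # productivity steps up in year 5

p_fixed = np.full(T, 2.0)        # unemployment held at the long-run NAIRU
p_track = np.full(T, 2.0)        # unemployment tracks the short-run NAIRU

for t in range(1, T):
    gap = q[t] - q[max(0, t - window + 1): t + 1].mean()   # q - q*
    u_short = a / b - gap / b                              # equation 4'
    p_fixed[t] = a - gap - b * (a / b) + p_fixed[t - 1]
    p_track[t] = a - gap - b * u_short + p_track[t - 1]

print(np.round(p_fixed[[4, 8, 14, 24]], 2))  # falls while q > q*, then levels off
print(np.round(p_track[[4, 8, 14, 24]], 2))  # steady at 2.0 throughout

Holding unemployment at the long-run NAIRU, inflation falls for as long as q exceeds q* and then levels off; letting unemployment follow the short-run NAIRU down keeps inflation exactly steady, which is the sense in which the short-run NAIRU is the sustainable rate during the transition.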
In the charts presented in the paper, I put some rough quantitative dimensions on the short-run effect on inflation of an acceleration in productivity. To do so, we simply have to calculate the q - q* term. Three steps are required. First, we need an estimate of q, trend productivity growth. Second, we need to specify the period over which the moving average of q, q*, is to be calculated. Finally, we need to determine the value of the coefficient that measures the effect of the q - q* term on the short-run NAIRU. There is a broad consensus that trend productivity growth was about 3 percent from the 1960s through the early 1970s and then about 1-1/2 percent until the mid-1990s. As of yet, there is no consensus about either the magnitude or the time pattern of the acceleration in productivity. The upper end of the range of estimates for trend productivity today is between 3 percent and 3-1/2 percent. I used 3-1/2 percent as the estimate of q today in deriving the estimate of the disinflationary effect of productivity and the range for the short-run NAIRU. This decision obviously maximizes the potential importance of the acceleration of productivity in explaining the relationship between inflation and unemployment in this episode. Although the “curve” plotted in Figure 1 is a fairly simple representation of the historical trends in labor productivity growth, it is similar to estimates produced using more sophisticated statistical techniques. For example, as shown in the graph of actual and trend labor productivity in Figure A.1, a trend measure derived in a production accounting framework (the thick dark line) closely follows the curve shown in Figure 1, reproduced here as the dashed line. This production-based measure of trend labor productivity has three components: capital deepening (derived from Bureau of Labor Statistics (BLS) data on actual capital services, normalized by the FRB/US estimate of trend labor hours); the BLS estimate of changes in labor quality; and trend growth in multifactor productivity (MFP). The latter component is computed using a Hodrick-Prescott filter to extract the trend in actual MFP growth. The next step is to determine the period over which the moving average q* should be computed - that is, the period it takes for q* to converge to q once q stabilizes. There is no economic theory to guide us in selecting the length of this period. All we can do is try alternative periods and determine which yields the best predictive performance in explaining inflation. As you might suspect, it is difficult to pin down the precise period; this likely reflects the relatively limited experience with accelerations (or decelerations) in productivity growth and thus the paucity of data from which to make this estimate. I compute two alternative estimates of q*, assuming twelve-quarter and forty-quarter moving averages, both of which appear to do equally well in wage equations. We now have two estimates for the q - q* term in equation 3’. There are some factors that likely damp this effect and some that actually magnify it. A damping factor is the likelihood that firms respond to the acceleration in productivity by increasing their markup, absorbing some of the benefits in higher profits and passing on only a portion of the acceleration in lower inflation. We can capture this by assuming that m is the proportion absorbed in higher profits, so that only (1 - m) is passed on in lower inflation. Magnifying the disinflationary effect are the dynamics working from the initial effect on inflation to expected inflation and back to inflation - in effect, the virtuous wage-price cycle that is initiated by the direct disinflationary effect. One final step is required to estimate the effect of the q - q* term on the short-run NAIRU. We have to estimate the coefficient on the q - q* term in equation 4’. Taking into account the effect of the change in the markup, this term is (1 - m)/b, and I have assumed that it is equal to unity, a result roughly consistent with estimates of the parameter on the acceleration term in equations corresponding to this model. This yields the range for the short-run NAIRU of 4 percent to 5 percent plotted in Figure 3. I then used the mid-point, 4-1/2 percent, as a point estimate of the short-run NAIRU.

Figure A.1. Growth in adjusted non-farm business labor productivity

Reference
Braun, Steven. “Productivity and the NIIRU (and other Phillips Curve Issues).” National Income Section Working Paper 34, Board of Governors of the Federal Reserve System, June 1984.
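As a brief coda to the appendix, here is a minimal sketch of the trend-extraction step described above, using the Hodrick-Prescott filter from statsmodels. The input here is a random placeholder series, not actual MFP data, and a full reconstruction would also add the capital-deepening and labor-quality components:

import numpy as np
import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter

# Placeholder for an actual quarterly MFP growth series.
rng = np.random.default_rng(0)
dates = pd.period_range("1960Q1", "2000Q2", freq="Q")
mfp_growth = pd.Series(1.0 + rng.normal(0.0, 0.8, len(dates)), index=dates)

# Extract trend MFP growth with a Hodrick-Prescott filter; lamb=1600 is
# the conventional smoothing parameter for quarterly data.
cycle, trend_mfp = hpfilter(mfp_growth, lamb=1600)
print(trend_mfp.tail(4).round(2))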
Laurence H Meyer: The politics of monetary policy - balancing independence and accountability Remarks by Laurence H Meyer, a member of the Board of Governors of the US Federal Reserve System, at the University of Wisconsin, LaCrosse, Wisconsin on 24 October 2000. * * * It is widely believed, at least among central bankers, that “independence” is a prerequisite for achieving the goals that traditionally have been assigned to central banks - specifically, for achieving price stability. “Independence” does not mean literally independence from government, because central banks here and abroad are almost always part of government. The relationship of central banks to the rest of government is, in practice, therefore much more complex than the term “independence” might suggest. The motivation for granting independence to central banks is to insulate the conduct of monetary policy from political interference, especially interference motivated by the pressures of elections to deliver short-term gains irrespective of longer-term costs. The intent of this insulation is not to free the central bank to pursue whatever policy it prefers - indeed, every country specifies the goals of policy to some degree - but to provide a credible commitment of the government, through its central bank, to achieve those goals, especially price stability. Even a limited degree of independence, taken literally, could be viewed as inconsistent with democratic ideals and, in addition, might leave the central bank without appropriate incentives to carry out its responsibilities. Therefore, independence has to be balanced with accountability - accountability of the central bank to the public and, specifically, to their elected representatives. It is important to appreciate, however, that steps to encourage accountability also offer opportunities for political pressure. The history of the Federal Reserve’s relationship to the rest of government is one marked by efforts by the rest of government both to foster central bank independence and to exert political pressure on monetary policy. The purpose of this paper is to clarify the relationship of central banks within government, to explain the nature and degree of, and the rationale for, the independence afforded to many central banks - with a special focus on the role of the Federal Reserve within the US government - and to discuss the balancing of independence and accountability in principle and in practice.

Independence

The dictionary defines independence as being free from the influence, guidance, or control of another or others. As applied to central banks, that translates into being free from the influence, guidance, or control of the rest of government, meaning both the executive and legislative branches in the United States. It is useful to distinguish two types of independence for central banks: goal independence and instrument independence. If a central bank is free to set the final objectives for monetary policy, it has goal independence. If a central bank is free to choose the settings for its instruments in order to pursue its ultimate objectives, it has instrument independence. Most central banks have specific legislative mandates and therefore do not have goal independence.
Thus the “independence” of “independent” central banks is instrument independence, under which the central bank has authority to choose settings for its instruments in order to pursue the objectives mandated by the legislature, without seeking permission from, or being overturned by, either the executive or the legislature. However, countries vary considerably in the specificity of the mandated goals and hence in the degree of discretion of central banks in the conduct of monetary policy.

The need for independence

Central bank independence is designed to insulate the central bank from the short-term and often myopic political pressures associated with the electoral cycle. Elected officials have incentives to deliver benefits before the next election even if the associated costs might make them undesirable from a longer-term perspective. This phenomenon has been called the political business cycle, in which pre-election stimulus leads to higher inflation followed by monetary restraint after the election. On the other hand, elected officials in many countries apparently understand the incentives under which they operate and have structured charters for their central banks that, in effect, tie their own hands - that is, limit political interference with monetary policy to enhance the prospects of achieving and maintaining price stability. Nevertheless, the urge to exert political pressure - to support the objectives of the Administration as well as those of the Congress, to take the US case, and at other times to support the re-election of the President or of congressional incumbents - sometimes becomes irresistible. At such times, the tradition of independence at the Fed, the leadership of its Chairman, the influence of long terms for governors, and the presence of Reserve Bank presidents on the Federal Open Market Committee (FOMC) become especially important. In addition, budget priorities and monetary policy objectives can be in conflict. The executive branch generally wants to keep the cost of servicing its debt low, and this preference might be at odds with the need for monetary policy to vary interest rates to maintain price stability. This tension was present during both World Wars and for several years following World War II. Finally, especially in countries where debt markets are not well developed, central banks might be called upon to finance budget deficits by printing money, again interfering with maintaining price stability. The Federal Reserve, for example, was asked to directly underwrite government debt during World War I, but a statutory prohibition on directly purchasing government debt was later added to the Federal Reserve Act. Some have worried that even an independent central bank could succumb to the temptation to stimulate the economy today at the expense of higher inflation in the future. This is referred to as the problem of time inconsistency. That is, the central bank has an incentive to commit itself to price stability and then to renege on this promise in order to gain employment in the short run with relatively little initial sacrifice in the form of higher inflation. In the long run inflation would rise and the central bank would either have to tolerate the higher rate of inflation or push output below potential for a while to restore price stability. Once the public understood this process, moreover, it would expect higher inflation, so that, in the longer run, the result could be higher inflation without any short-run gain in output.
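The logic of the time-inconsistency problem can be made concrete with the textbook Barro-Gordon setup - a minimal sketch with purely illustrative parameter values, not a model from these remarks:

```python
# Textbook Barro-Gordon sketch of the time-inconsistency problem.
# Illustrative parameters only; nothing here is estimated.

a = 1.0           # slope of the expectations-augmented Phillips curve
lam = 0.5         # weight the central bank places on the output gap
y_n = 100.0       # potential output
y_target = 102.0  # output target above potential -- the source of the bias

def best_response(pi_expected):
    """Discretionary policy: minimize pi**2 + lam*(y - y_target)**2
    subject to y = y_n + a*(pi - pi_expected). The first-order condition
    gives pi = lam*a*(y_target - y_n + a*pi_expected) / (1 + lam*a**2)."""
    return lam * a * (y_target - y_n + a * pi_expected) / (1 + lam * a**2)

# Iterate expectations to the rational-expectations equilibrium (pi = pi_e).
pi_e = 0.0
for _ in range(100):
    pi_e = best_response(pi_e)

# Fully anticipated inflation yields no output gain: y = y_n.
output = y_n + a * (pi_e - pi_e)
print(f"inflation under discretion: {pi_e:.2f}")  # lam*a*(y_target - y_n) = 1.00
print(f"inflation under commitment: {0.0:.2f}")   # same output, lower inflation
print(f"output in both cases: {output:.2f}")
```

Once the public anticipates the stimulus, the equilibrium delivers the inflation bias with no output gain - exactly the outcome described above.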
Several solutions to the time inconsistency problem have been offered. First, the rest of government could impose a rule on the central bank, restricting its ability to play the game described above. The rule would ensure a credible commitment to price stability, thereby anchoring the public’s expectations and removing the inflationary bias that otherwise might result. Second, the government could appoint conservative central bankers - central bankers with a greater commitment to price stability than the public - and thereby offset the inflationary bias that would otherwise arise. Third, central bankers could be forced to operate under performance or incentive contracts, whereby they could be penalised for failure to maintain price stability. The Governor of the Reserve Bank of New Zealand operates under such a performance contract; he can be removed from office for failure to achieve his inflation target.

I have never found the literature on time inconsistency particularly relevant to central banks. Surely central banks realise they are facing a repeated game, not a one-time game. They will therefore be reluctant to undermine their credibility over the longer run by pretending to pursue price stability while stimulating the economy for short-run gain. Long terms and other institutional ways of insulating central banks from short-term political pressures allow central bankers to take this longer view and make them less likely to follow time-inconsistent policies. Still, the problem highlighted in the time inconsistency literature may reinforce the case for both a price stability legislative mandate and instrument independence for the central bank.

Independence also is likely to reinforce the credibility of a central bank’s commitment to price stability. This enhanced credibility may then yield additional benefits. First, it could allow the central bank to reduce the cost of lowering inflation. It is generally agreed that, to lower inflation, monetary policy must reduce output for a while, relative to potential, by reducing aggregate demand. The resulting loss of output during the transition to lower inflation is a measure of the cost of reducing inflation. The more quickly inflation expectations fall, the more rapidly will inflation itself decline, and the lower will be the cost of reducing inflation. A credible central bank could also be more effective in conducting stabilisation policy. If aggregate demand were to slow, a stimulative monetary policy move would be less likely to undermine confidence in the central bank’s pursuit of price stability when the central bank is independent (and has a price stability mandate). In addition, if inflation moved upward, inflation expectations would be less likely to follow immediately, making it easier for the central bank to contain inflation.

Evidence on the benefits of independence

An extensive literature examines the relationship between the independence of the central bank and economic performance. The empirical studies generally find an inverse relationship between measures of central bank independence and both average inflation and the variability of inflation, at least for developed economies. These are only correlations, however, and thus do not prove causation. The inverse relationship could also reflect the fact that countries with less aversion to inflation might be less likely to have independent central banks.
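The calculation behind these findings is simple to sketch. The example below runs it on synthetic data - the “countries” and numbers are randomly generated for illustration and are not the published indices or inflation rates:

```python
# Minimal sketch of the cross-country calculation behind such studies,
# run on synthetic data generated for illustration only.
import numpy as np

rng = np.random.default_rng(42)
n_countries = 18  # roughly the industrial-country samples discussed below

# Synthetic independence index on a 0-1 scale, and average inflation that
# is inversely related to it plus noise -- the pattern the studies report.
independence = rng.uniform(0.0, 1.0, n_countries)
avg_inflation = 8.0 - 5.0 * independence + rng.normal(0.0, 1.5, n_countries)

corr = np.corrcoef(independence, avg_inflation)[0, 1]
print(f"correlation: {corr:.2f}")  # negative by construction

# The same negative correlation would appear if causation ran the other
# way -- e.g. if inflation-averse countries simply chose independent
# banks -- which is why correlation alone does not prove causation.
```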
In addition, there is no consistent evidence of a relationship between central bank independence and real economic activity, nor is there consistent evidence that central bank independence lowers the cost of reducing inflation or increases the effectiveness of stabilisation policy. On balance, the evidence on the benefits of central bank independence is strong enough to satisfy those who find the theoretical arguments persuasive, although it is not strong enough to convince skeptics.

The history and evolution of central bank independence

A century ago there were only 18 central banks, 16 in Europe, plus Japan and Indonesia. Today there are 172 central banks, and over recent years the number of central banks that claim some degree of independence within government has steadily increased. More central banks have become independent in the 1990s than in any other decade since World War II. Changes in Britain, Japan and continental Europe made 1998 a banner year in the history of central bank independence. The Bank of England, one of the oldest central banks in the world, was founded by an act of Parliament in 1694. It was involved in commercial activity until the end of the 19th century, but it had gradually shifted during those 200 years toward an exclusive focus on central bank activity. The Bank of England had substantial independence for much of the 18th and 19th centuries, but by the 20th century it had essentially become an agency of the British Treasury. Then, in June 1998, it was reborn as an independent central bank under the current Labour government. The Bank of Japan gained operational independence in April 1998. The Bank is still not legally independent, a status prevented by the Japanese constitution. In addition, representatives of both the Ministry of Finance and the Economic Planning Agency attend meetings in a non-voting capacity. But before April 1998, the Ministry of Finance could require the Bank to delay implementation of a change in policy; now it can only ask. Recently, the Ministry of Finance indeed asked the policy committee of the Bank of Japan to delay a decision to raise the Bank’s target interest rate. In an exercise of the Bank’s newly attained power, the policy committee rejected the request. The European Central Bank (ECB) began operating on 1 June 1998, and assumed responsibility for monetary policy in the euro area on 1 January 1999. The ECB is the world’s first supranational central bank and probably qualifies as the most independent central bank in the world. The charter for the European System of Central Banks (composed of the ECB and the national central banks of the member countries) is an international treaty that can be changed only by unanimous consent of its signatories. With its supranational status, the ECB is further removed from the political pressure of national governments than even the most independent national central banks. In addition, there is no political counterpart to the supranational ECB. The European Parliament carries out oversight hearings on monetary policy but does not have any authority with respect to the ECB.

Independence and the Federal Reserve System

The Federal Reserve, created in 1913, was established as an independent central bank - although, at the time, it was given no clear concept of its role in the conduct of monetary policy.
The only reference to policy goals in the original Federal Reserve Act was that the Federal Reserve was responsible for providing an elastic currency - that is, one that would expand as appropriate to accommodate the need for additional transactions as production and spending grew. The major question for the founders was the degree to which the US central bank should be a public or a private institution. Bankers wanted a largely private central bank. Populists wanted a public institution. President Wilson and Congressman Glass steered a middle course. There would be a Federal Reserve Board that was completely public and Federal Reserve Banks that would have significant characteristics of private institutions. During the first half century of Federal Reserve history, the Congress continued to focus more on issues involving the structure of the Federal Reserve than on providing a clear legislative mandate for monetary policy or oversight of the conduct of monetary policy. A former Fed governor, Andrew Brimmer, in a 1989 paper entitled “Politics and Monetary Policy: Presidential Efforts to Control the Federal Reserve”, describes the record of almost “continuous and at least public and vigorous conflicts” between Presidents and the Federal Reserve. In his view, twelve of the fourteen Presidents between the founding of the Federal Reserve and the time he was writing - from Woodrow Wilson to George Bush - had “some kind of public debate, conflict, or criticism of Federal Reserve monetary policy”, the exceptions being Calvin Coolidge and Gerald Ford. He alleged that Presidents resented the delegation of monetary policy by the Congress to an independent Federal Reserve and sought ways to bring monetary policy under their influence, often by exerting direct political pressure on the Federal Reserve, but principally through the appointment process. Examples of the latter cited by Brimmer include Nixon, believing that the Federal Reserve had cost him the election in 1960, replacing Chairman William McChesney Martin with Arthur Burns in February 1970 when Martin’s term expired; Carter, appointing William Miller to replace Chairman Burns in 1978; and Reagan, appointing Alan Greenspan as Chairman in 1987. For the most part, Presidents’ best efforts to appoint sympathetic choices as Chairmen have, in Brimmer’s judgment, been frustrated by the systematic tendency of Chairmen and other Board members to insist on exercising their congressional mandate. Thomas M Havrilesky, in a 1992 book, also provides an account of, and some attempts to measure, the intensity of political pressure over time, based on the number of comments on monetary policy made by Administration officials, including the President, and by members of the Congress. He concludes that there was little pressure from the executive branch during the Eisenhower and Ford Administrations, but many more such efforts in the Kennedy, Johnson and Nixon Administrations. My experience on the Board is that the Clinton Administration has respected the independence of the Federal Reserve to a degree that, given the accounts of others, may exceed that of any previous Administration. To be sure, President Clinton has had opportunities to make appointments to the Federal Reserve Board and he has twice reappointed Alan Greenspan as Chairman. But to my knowledge the Administration has never made any public or private effort to influence monetary policy.
The Federal Reserve has been technically independent of the President from the beginning, even though the Secretary of the Treasury and the Comptroller of the Currency originally sat on the Board. Although it is a creature of the Congress, the Federal Reserve Act delegated control over the currency to the Board, and the Congress insulated the Federal Reserve from elective politics to a large degree. The current structure of the Federal Open Market Committee was introduced in the Banking Act of 1935, which became effective in March 1936. At that time the Secretary of the Treasury and the Comptroller of the Currency were removed from the Board.1 The terms of governors were extended from ten to fourteen years and the Chairman and Vice Chairman were made appointees from within the Board with four-year terms. This structural change is often viewed as allowing the culture of independence to flourish at the Fed. The legislation was also a battle between the Administration and the Congress. The Administration wanted to shift the power over monetary policy toward the centralised and presidentially appointed Federal Reserve Board governors, a group it had a better opportunity to influence through the appointment process. The Congress partly resisted and diluted the control of the Administration by allowing a role for the Reserve Bank presidents on the FOMC.

1 During the debate on the Banking Act of 1935, Carter Glass expressed the view that having the Secretary of the Treasury on the Federal Reserve Board resulted in enormous influence by the Administration over the decisions of the Board. He felt he had been able to get the Board to do whatever he wanted when he was Secretary of the Treasury in 1919. Certainly his predecessor as Treasury Secretary, W G McAdoo, was a dominant presence whenever he attended Board meetings. At the time, this arrangement didn’t completely snuff out Federal Reserve independence because the Reserve Banks were important in formulating policy (more important than the Board through much of the 1920s). When the FOMC was established in 1935, ensuring that a Board with all seats filled would have the majority of the votes on policy decisions, it became imperative to end Administration influence by removing the Secretary of the Treasury and the Comptroller of the Currency from the Board.

During both World Wars, the Treasury wanted to issue securities at low interest rates to ease the burden of financing, and the Fed went along because it felt bound to facilitate wartime financing. In addition, during World War I, Reserve Banks bought most of the government’s first $50 million certificate issue directly from the Treasury despite strong objections from some System officials. Such direct purchases were later eliminated, and the statutory prohibition on direct underwriting of government debt is today considered one of the principal protections of the independence of a central bank. After World War I, the Treasury opposed raising the discount rate to combat inflation, but the Fed did so anyway. During World War II, the Fed sacrificed its independence by agreeing to peg the Treasury yield curve to ensure low rates for wartime financing. After the war, the Fed wanted to resume an independent monetary policy, fearing that it would otherwise become an engine of inflation, but the Treasury was still concerned about minimising the service cost of the debt. To resolve this conflict, an agreement was negotiated in 1951 by Assistant Secretary of the Treasury William McChesney Martin and Fed officials. The Congress, led by Senator Paul Douglas, also played an important role through its support for Federal Reserve independence. Under the terms of the Accord, as it came to be known, the Fed was no longer obligated to peg the interest rates on Treasury debt, but it was agreed that active consultation between the Fed and the Treasury would continue. That active consultation continues today.

From the end of World War II until the mid-1970s, the mandate for monetary policy was based on the Employment Act of 1946. This legislation set out a general mandate for the government. Although it did not explicitly refer to the Federal Reserve, it was widely understood that the act applied to the central bank as a part of government. The act identified the government’s macroeconomic policy objectives as fostering “conditions under which there will be useful employment opportunities … for those able, willing, and seeking to work, and to promote maximum employment, production, and purchasing power”.

Conflict between the executive branch and the Federal Reserve erupted dramatically in December 1965. President Johnson did not want the Administration’s stimulative fiscal policy undermined by restrictive monetary policy. Chairman Martin supported an increase in the discount rate as an appropriate step to contain the risk of higher inflation. A key vote occurred on a proposed increase in the discount rate at a Board meeting on 3 December. Although the President tried to influence the Chairman’s position, and others in the Administration put pressure on other members of the Board, the Board of Governors voted 4-3 to support the Chairman. Following the vote, the President summoned the Chairman to the President’s ranch in Texas. But the vote stood. The independence of the Fed was preserved and indeed used for precisely the purpose it was intended. Subsequently, virtually everyone agreed it had been the correct decision. The system worked.

The Congress became more involved in the monetary policy process in the 1970s. This was a response to both poor economic performance and changing views about the importance of monetary aggregates in shaping economic developments, especially inflation. Inflation began to rise in the late 1960s and escalated further in the 1970s. During this period, monetarism was an increasing influence, with its focus on the importance of limiting the rate of growth of the money supply to control inflation. But it was the sharp recession in 1974-75 that really provoked the Congress to provide more detailed instructions to the Federal Reserve about the objectives that should guide monetary policy. In 1975, the House and Senate passed Concurrent Resolution 133, calling on the Fed to lower long-term interest rates and expand the monetary and credit aggregates to promote recovery. The Fed was also instructed to set money growth targets and to participate in periodic congressional hearings on monetary policy. For the first time, the Congress explicitly identified the objectives for monetary policy. The same language about the objectives applies today.
Still, with its focus on the conduct of monetary policy at a point in time (rather than on general guidelines on policy objectives to be applied over time), the resolution was a clear instance of action by the Congress to intervene and influence monetary policy. The monetary policy objectives written into the concurrent resolution were added by an amendment to the Federal Reserve Act in 1977 and were further elaborated in the Full Employment and Balanced Growth Act of 1978, often referred to as the Humphrey-Hawkins act after its co-sponsors. Another clear attempt at political interference emerged in February 1988, when an undersecretary of the Treasury sent a letter to Federal Reserve officials urging them to ease monetary policy. The request was promptly and publicly rebuked by Chairman Greenspan. Having an attempt at political pressure become public and be sharply rejected was an unusual event in the history of the relationship between the executive branch and the Federal Reserve. The reporting requirements in the Humphrey-Hawkins act expired in May 2000. As a result, the Congress is now reconsidering the monetary policy oversight process. In part because the link between money growth and nominal spending appears to be less tight than it was earlier, the role of money growth in monetary policy deliberations has diminished, and it appears likely that the Congress will no longer require the Fed to set and report money growth ranges. However, the current language about the objectives of monetary policy seems likely to be retained, as does semiannual testimony on monetary policy.

Sources of independence

Central bank independence is in part the result of formal institutional features typically incorporated in the legislation creating and defining the central bank. The legislation creating an “independent” central bank - or in many cases revisions to such legislation - often removes goal independence entirely by mandating objectives for monetary policy, but otherwise sets up a structure that confers and protects instrument independence. The most important requirement for instrument independence is that the central bank be the final authority on monetary policy. That is, monetary policy decisions should not be subject to veto by the executive or legislative branches of government. Instrument independence is further protected if other institutions of government are not represented on the monetary policy committee. A lesser protection would be to allow government representation but only in a non-voting capacity. Instrument independence is further facilitated by long, overlapping terms for members of the monetary policy committee; by limited opportunities for reappointment; and by committee members not being subject to removal except for cause - where “cause” refers to fraud or other personal misconduct but explicitly excludes differences in judgment about policy. An intangible contributor to independence, but arguably the most important, is the appointment of a capable, respected, politically astute, and “independent minded” chairman. A third important protection of independence is achieved by freeing the central bank from the appropriations process. Many central banks have been granted the seignorage function - issuing currency for the government - and cover the cost of their operations from the earnings on their portfolio of government securities acquired in the process, returning the excess to the government.
Finally, it is critically important to ensure that the central bank will not be required to directly underwrite government debt. As I noted above, the Treasury or Finance Ministry will have an incentive to keep interest rates low to reduce the cost of servicing the government debt. Indeed, perhaps the first principle of central bank independence is independence from the fiscal authority. If independence is also defined in terms of assuring the ability and commitment of the central bank to achieve price stability, this commitment can be protected by an explicit price stability mandate from the government. That is, a government that explicitly imposes this mandate is less likely to interfere in a central bank’s pursuit of this objective. Independence, by this definition, is viewed as greatest if price stability is the exclusive objective of monetary policy, or at least the principal objective.

Empirical studies of the relative independence of central banks

Studies of the economic consequences of central bank independence typically estimate the economic effects by first deriving quantitative measures of the relative independence of central banks and then estimating how this measure is correlated with average inflation, inflation variability and real economic performance. Reviewing three of these studies will help to clarify the meaning and sources of central bank independence and perhaps provide at least some insights into how the Federal Reserve ranks relative to other central banks in terms of independence. Bade and Parkin (1988) ranked the political independence of twelve industrial country central banks on the basis of answers to questions such as “Is the bank the final policy authority?” and “Is there no government official (with or without voting power) on the bank board?” Grilli, Masciandro, and Tabellini (1991) also incorporated information on the length of terms of monetary policy committee members and on the policy goals of the central bank with respect to monetary policy, specifically whether there is a mandate for monetary stability (including money growth or price stability objectives). Cukierman (1992) also takes into account restrictions on the ability of the public sector to borrow from the central bank: a central bank is more independent if it is protected, for example, from directly underwriting the government debt. Germany (prior to its participation as part of the ECB) and Switzerland have been uniformly ranked the most independent of central banks. The United States fell in the second tier in the Bade and Parkin rankings; was just below the most independent central banks in the Grilli, Masciandro, and Tabellini rankings of eighteen industrial countries; and was tied for fourth place among seventy countries in Cukierman’s rankings. The Federal Reserve lost points in these rankings because of the brevity of the Chairman’s term (less than five years) and the failure to single out price stability as the unique or principal objective. Of these studies, I prefer Bade and Parkin’s methodology for ranking independence because they included only those institutional characteristics that afforded a measure of independence to the central bank. Grilli, Masciandro, and Tabellini and Cukierman also included in their measures the nature of monetary policy objectives, ranking independence higher if there is a price stability objective and, in Cukierman’s case, higher still if price stability is the only or at least principal objective.
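To see how such indices are built, consider a toy scoring function in the spirit of the criteria just described. The criteria paraphrase those studies, but the equal weights, the eight-year term threshold, and the example inputs are illustrative choices of this sketch, not the published methodologies:

```python
# Toy independence index in the spirit of the studies described above.
# Weights, thresholds, and inputs are illustrative only.
from dataclasses import dataclass

@dataclass
class CentralBank:
    final_policy_authority: bool    # Bade-Parkin: is the bank the final authority?
    no_government_on_board: bool    # Bade-Parkin: no government official on the board?
    governor_term_years: int        # Grilli et al.: longer terms score higher
    price_stability_mandate: bool   # Grilli et al.: statutory monetary-stability goal
    no_direct_deficit_finance: bool # Cukierman: barred from underwriting government debt

def independence_score(cb: CentralBank) -> int:
    """Return a 0-5 toy index: one point per criterion, with the
    term-length point awarded when governors serve at least eight years."""
    return (cb.final_policy_authority
            + cb.no_government_on_board
            + (cb.governor_term_years >= 8)
            + cb.price_stability_mandate
            + cb.no_direct_deficit_finance)

# Hypothetical example: a Fed-like bank loses the mandate point because
# price stability is not its single or principal statutory objective.
fed_like = CentralBank(True, True, 14, False, True)
print(independence_score(fed_like))  # 4 of 5
```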
Under such a definition, a central bank with more discretion - for example, as a result of multiple objectives, as in the case of the United States - is ranked as less independent than a central bank that has little discretion on account of a single, precisely defined price stability objective. Of course, defining independence to involve a mandate making price stability the single or principal objective increases the potential for an inverse relationship between “independence” and inflation.

Accountability

Accountability means being answerable for one’s decisions. Implicit in being accountable is being subject to discipline, from those to whom you are accountable, for failure to live up to your responsibilities. Making the central bank accountable in this way involves, by definition, some compromise of the independence of the central bank. But accountability is the critical mechanism for ensuring both that the central bank is operated in a way consistent with democratic ideals and that the central bank operates under incentives to meet its legislative mandate for monetary policy. On the other hand, as I noted earlier, steps to increase accountability also create opportunities for political interference. Every organisation’s performance is likely to be enhanced by appropriate incentives. In the private sector, the incentives for a business are profitability and, indeed, survival. In the public sector, other means must be found to provide incentives. Elections, of course, play this role for elected officials. With central banks having been given an arm’s-length relationship with the electoral process, some have suggested that central bank policymakers should operate under explicit incentive contracts. But, for the most part, accountability is achieved for central banks both through the appointment process and through regular oversight by the legislature. Accountability is facilitated by providing the central bank with a specific, external (usually legislatively imposed) mandate. Two aspects of designing the objectives for monetary policy are important. First, a single objective (typically price stability) makes the central bank more accountable, because multiple targets always carry trade-offs, at least in the short run, which are subject to the discretion of the central bank. Second, explicit numerical targets make central banks more accountable than more general targets. Specifically, an explicit numerical inflation target makes the central bank more accountable than a more general commitment to price stability. There are, however, other considerations that are relevant to setting the mandate. First, if there is a single target for a central bank, it will surely be price stability, given that monetary policy is the principal, even unique, determinant of inflation in the long run. While a single target is more precise, few legislatures would tolerate a central bank disclaiming any responsibility for the cyclical state of the economy or at least failing to respond to cyclical weakness. Indeed, given the inescapable trade-off between inflation variability and output variability, a central bank naturally, even inevitably, accounts for the variability of output around full employment when deciding how rapidly to try to restore price stability in cases where shocks or policy mistakes move the economy away from this goal. Inflation-targeting central banks often take account of output variability by defining a period of time over which any return to price stability should occur, typically two years.
But such a fixed boundary may not encompass the optimal response to all shocks. A second consideration in setting the mandate is that flexibility can be a valuable asset for policymakers, given the variety of shocks that the economy may face, structural changes that could affect the nature of trade-offs faced by policymakers, and the possibility of short-run trade-offs among multiple targets. So, less precise objectives and multiple targets provide flexibility for the policymaker. To the extent that there is a single and explicit target, accountability is narrowly about performance relative to that target. On the other hand, when there are multiple targets - and hence inherent shorter-run flexibility - and less precisely defined targets, the oversight by the legislature will typically focus more broadly on the judgments that the central bank has made in pursuing its legislative mandate. A second source of accountability is the reappointment process. If terms are short, and especially if the Chairman and other voting members can be reappointed for additional terms, more control can be exercised through the appointment process, and committee members can more easily be held accountable for their policy votes. This is a clear example of the trade-off between independence (facilitated by long terms without the possibility of reappointment) and accountability (facilitated by short terms with opportunities for reappointment). As I noted earlier, Federal Reserve Board governors are appointed by the President, subject to Senate confirmation, for nominal fourteen-year terms. Such long, overlapping terms facilitate independence. However, if a Board member resigns before his or her term has expired, the successor is appointed for the remainder of that term. At the end of a partial term, a governor can be reappointed for a full term, but reappointment is at the discretion of the President and is again subject to confirmation by the Senate. Once a full term has been served, no reappointment is possible. The average actual tenure of governors has been between five and six years over the last twenty-five years. However, the term of the Chairman and the Vice Chairman of the Board of Governors is only four years, and both can be reappointed for additional terms as Chairman and Vice Chairman for as long as they remain on the Board. Such short and renewable terms reduce independence but facilitate accountability. In addition, they provide an important opportunity for the President to try to influence monetary policy decisions by pressures exerted on the Chairman subsequent to appointment. To a lesser degree, appointment of governors and direct pressure on them are further avenues of political influence that have been employed, at least on occasion. The authors of the Banking Act of 1935, which established the FOMC in its modern form, implemented the system of long, overlapping terms for governors and shorter, renewable terms for the Chairman and Vice Chairman. It seems to me they made a conscious effort to balance independence and accountability. The short, renewable term for the Chairman would enhance accountability and encourage a strong working relationship between the Chairman and the executive and legislative branches. On the other hand, the long and effectively non-renewable terms for governors would protect the fundamental independence of monetary policy.
So the Federal Reserve loses points in some indices of independence because of the short term of the Chairman, but the resulting balance between independence and accountability has, in my view, contributed over the years to a successful relationship between the central bank and the rest of government.

Transparency and disclosure

Transparency and disclosure are also essential to accountability. Transparency refers to being easily understood. With respect to monetary policy, it refers to the immediacy with which the public learns of policy decisions and the amount of information provided about the rationale for policy actions and the assessment of how possible future developments bear on policy. The legislature, for example, needs information about the policy actions and an understanding of the rationale for the policy if it is to be able to hold the central bank accountable. In addition, it is generally agreed that markets work better with more complete information, although some worry that a continuous flow of information on the leanings of members of the policy committee can result in excessive volatility in financial markets. The Congress has, over time, made efforts to increase the transparency of the monetary policy process and widen the scope of disclosures of monetary policy decisions and of the discussions leading up to those decisions. Historically, the Federal Reserve’s initial response has been to try to preserve the status quo, but over time it has come to accept and even appreciate the evolution toward greater transparency and disclosure. Nevertheless, continuing concerns have been the potentially deleterious effect of still greater transparency and disclosure on the effectiveness of the deliberative process and the possible effects on the volatility of financial markets. Transparency is influenced by the operating procedures used to implement monetary policy. It is furthered by announcements of policy changes, along with statements explaining the rationale underlying policy actions, and by timely and sufficiently detailed reporting of the substantive discussions leading to the policy decisions. The Federal Reserve used to set its policy in terms of the tightness of reserve positions (so-called “reserve market conditions”). This was a very imprecise way of setting and explaining policy, making it more difficult for the public and the Congress to monitor and evaluate monetary policy decisions. One of the developments of FOMC practice under Chairman Greenspan was to set policy explicitly in terms of a target rate for the federal funds rate.2 Initially, these decisions were not directly conveyed to the public. Instead, the Federal Reserve Bank of New York altered the way in which it implemented open market operations to alert financial markets to the change in policy. In February 1994, the Federal Reserve began announcing on the day of each meeting any change in its federal funds target and formalised that decision in February 1995. At the same time, it began to offer a brief statement explaining the rationale for the policy change. A policy of issuing a statement even when there was no change in policy was implemented last year.

2 It could be argued that policy began to evolve in this direction in 1983, when the FOMC set targets for borrowed reserves. That target was, in effect, a proxy for the federal funds rate. But the proxy was imprecise, and it wasn’t until the late 1980s that the Committee became clearer about its federal funds rate expectation.
The effect of monetary policy derives not only from the explicit policy actions taken, but also from expectations about future policy. Until quite recently, the Federal Reserve opposed earlier release of its directive or minutes precisely because that would provide hints about the prospects for future policy, and this could result in volatility in financial markets. Today, however, not only does the FOMC immediately announce its policy decisions and provide a rationale for policy changes, it also reveals whether the committee believes the risks to achieving its goals are balanced or unbalanced and, if unbalanced, in what direction. Since the early 1980s, the so-called tilt has been part of the directive, but in May 1999 the FOMC began to report changes in the tilt on the day of the meeting, and, since that time, the markets have focused considerable attention on what the Federal Reserve says about the future. Transparency is also enhanced by disclosure - including the announcement of policy actions and of the rationale for policy actions, the release in the minutes of a summary of the Committee’s substantive discussion about the economic outlook and the appropriate course of policy, and testimony before the Congress. The Federal Reserve releases the minutes of each meeting shortly after the following meeting - in effect, a delay of six or seven weeks. Some have encouraged a further step toward enhanced transparency by speeding this release. The transcripts of an entire year of meetings - lightly edited verbatim records of the deliberations, with redactions for sensitive information related to foreign governments or specific businesses or individuals - are released with a lag of five years. The decision to do so was made in February 1995. Until late 1993, it was not widely known - inside or outside the Federal Reserve - that verbatim records of FOMC meetings (transcribed from audio tapes) were retained. Once the minutes were released, the tapes themselves were erased - actually taped over - in conformance with Committee directives. When the Congress learned of the availability of the transcripts, it demanded that they be released to the public, and the current procedures were negotiated between the Federal Reserve and the Congress. The transcripts are a useful historical record of FOMC meetings and provide scholars as well as current Board members with insights into the monetary policy process and its evolution over time.

The Federal Reserve and the executive branch

Independence and accountability - as important as these concepts are - do not effectively convey the full richness of the relationship of the Federal Reserve to the rest of the government. The relationship in practice is, after all, as much informal as formal. Informal relationships, and even the effectiveness of formal ones, have a lot to do with personalities as well as with institutional history and traditions. And these informal relationships are, in turn, important channels for political influence. One important reason for consultation and communication between the Federal Reserve and both the executive and legislative branches is the desirability of effective coordination of monetary and fiscal policies. The executive and legislative branches collectively set fiscal policy, while the central bank sets monetary policy. The appropriate monetary policy must give substantial weight to prevailing and expected fiscal policies. To a lesser degree, the same principle is at work in the formation of fiscal policies.
I say to a lesser degree because I believe fiscal policies since the early 1980s have been set more on the basis of longer-run considerations - such as promoting growth - than for short-run stabilisation purposes. As a result, stabilisation policy is today principally a concern of the central bank except under extreme circumstances. The forecast of the central bank must consider current and prospective fiscal policies, and monetary policy must adjust to fiscal policy changes, while the executive and legislative branches are somewhat freer to implement changes in long-run strategies as the political consensus allows or dictates. On the other hand, the absence of active stabilisation efforts by the executive branch (and the Congress) might increase their frustration about the stabilisation policies pursued by the Federal Reserve - or perhaps more likely at the perceived failure of the Federal Reserve to pursue full employment aggressively enough - and increase efforts at political interference with the conduct of monetary policy. The relationship between the Federal Reserve and the executive branch has evolved over the last half century toward a more informal and less structured relationship. The Eisenhower Administration established an Advisory Board on Economic Growth and Stabilization (ABEGS), which included the chairman of the President’s Council of Economic Advisers and the Fed Chairman - initially Arthur Burns and William McChesney Martin, respectively - plus cabinet members. With Arthur Burns in the lead, the ABEGS functioned as a forum for frequent consultations on the policy mix. During that period, fiscal policy had a more prominent role in stabilisation policy, with monetary policy playing a more supporting role. Some previous Fed Chairmen also have acted as close advisors to the President - for example, Mariner Eccles for President Franklin Roosevelt and Arthur Burns in the case of President Nixon - sometimes in discussions unrelated to the coordination of stabilisation policy. The Kennedy-Johnson Administration inherited the recession of 1960-61 and was determined to use fiscal policy to promote recovery. In the prelude to the 1964 tax cut, Chairman Martin was included in policy discussions as part of a “Quadriad” consisting also of the CEA chairman, the Secretary of the Treasury, and the director of the Budget Bureau. Chairman Martin was included mainly to ensure that the Fed did not offset the expected effect of the tax cut. Coordination slackened as Vietnam War spending stimulated the economy, to the alarm of Fed officials, leading to the December 1965 confrontation. This was the most dramatic example of attempted political interference with the conduct of monetary policy after the 1951 Accord. The minutes of the meeting at which the Board decided to raise the discount rate include interesting discussions of the tension between independence and coordination. Some governors wanted to defer to, or at least negotiate further with, the Administration on this issue because they viewed the Administration as having the primary responsibility for the conduct of national economic policy. Should the Federal Reserve frustrate the direction of that policy? And during the congressional hearings that followed there was much discussion about the dangers of a ship with two captains. Today, the interaction between the Federal Reserve and the Administration is more informal but also perhaps more continuous.
However, that relationship is less focused on monetary-fiscal policy coordination than on regulatory and international economic issues. This change reflects the smaller role of fiscal policy in stabilisation ever since the early 1980s, when the Reagan Administration shifted the focus to longer-run issues related to encouraging more rapid trend growth. Stabilisation policy since that time has been dominated by the Federal Reserve, with coordination of policies becoming especially important at major turning points in the thrust of fiscal policy - for example, when the Clinton Administration decided to make a reduction in the structural federal budget deficit the centerpiece of its economic policy strategy in 1993. The Secretary of the Treasury and the Chairman of the Federal Reserve meet frequently, many times for breakfast or lunch, often two or three times a week. The meetings are generally, though not always, short, with no formal agenda and no staff. These meetings date back to the Treasury-Federal Reserve Accord in 1951, but today, apart from telephone consultations, they are the main source of ongoing contact between the Chairman and the Administration. There are a number of other opportunities for regular contact among Federal Reserve governors and members of the Administration’s economic team. A governor (on a rotating basis) hosts a weekly lunch for senior staff of the Treasury and the Federal Reserve. While the meetings are often social as well as substantive, they are opportunities to discuss issues of mutual concern. Some of the most effective meetings are “theme” meetings, when we agree in advance to focus on a particular issue. Given the Treasury’s respect for the independence of the Fed, participants will rarely discuss monetary policy, although they occasionally touch on the economic outlook. Regulatory issues, debt management, or international economic issues tend to dominate. But the contacts made and refreshed at these meetings are extremely constructive when discussions between the Federal Reserve and the Treasury are called for, again most often on regulatory issues. Members of the Board and members of the CEA also meet monthly for lunch. Once again, discussions of the economic outlook and monetary policy are rare. But the discussions often involve interesting issues related to the outlook, such as the sources of the increases in productivity growth and why most other countries have not benefited significantly thus far from the same developments. The President and the Chairman of the Federal Reserve meet occasionally - more recently, generally a couple of times a year. These meetings typically are informal discussions without agendas and without announcements before or after the meetings. They usually also include the Vice President, the Secretary of the Treasury, and the President’s chief of staff. These are typically opportunities for the Chairman to brief the President on the US and global economic outlooks. The frequency of meetings between the Chairman and both the Secretary of the Treasury and the President has varied across Chairmen and Administrations, depending to an important degree on the individuals involved. The Federal Reserve and the Treasury participate in a variety of working groups - including the President’s Working Group on Financial Markets.
Treasury and Federal Reserve officials often serve together on US delegations to international organisations - including meetings of G7 and G10 finance ministers and central bank governors; regional organisations such as the Asia Pacific Economic Cooperation Council, the Manila Framework, and the Forum for Latin American Central Bank Governors; the Financial Stability Forum, OECD Working Party 3 and Economic Policy Committee, and G22; and bilateral economic dialogues, for example, with China and India. Before each such forum, it is typical for the US delegation to meet to coordinate its participation. Consultations were intense during the Mexico crisis in 1995, the Asian financial crisis in 1997-98, and in the discussions about reforming the international financial architecture since these events.

The Federal Reserve and the Congress

As I noted earlier, the Federal Reserve’s independence is a product of congressional legislation and can therefore be diminished at the will of the Congress (with the President’s approval and subject to override of any veto). The Congress must have such authority if its oversight of the Federal Reserve is to be credible and effective. This power is a rather blunt instrument, providing ample opportunity for the Federal Reserve to take advantage of its independence in the conduct of monetary policy. At the same time, it ensures that the Federal Reserve is extremely respectful of the oversight authority of the Congress and provides some leverage for congressional influence on the conduct of monetary policy. In assessing congressional influence, it is also useful to distinguish between the views of a vocal minority and the consensus view - insofar as it can be ascertained - of the Congress. It is sometimes difficult to separate direct political involvement in monetary policy from essential congressional oversight of monetary policy. Today it seems out of place for the Administration to comment directly on the conduct of monetary policy, though this may reflect the extraordinary relationship between this Administration and the Federal Reserve and the exceptional economic environment of the last several years. Only time will tell. At any rate, for the present, the relationship between the Administration and the Federal Reserve on monetary policy is confined to the President’s making appointments to the Board, while members of the Administration and the Board (and especially the Chairman and the Secretary of the Treasury) engage in regular but informal consultations. On the other hand, the Congress cannot fulfill its oversight responsibilities without actively engaging the Federal Reserve in a dialogue about the conduct of monetary policy. The Congress conveys its views on monetary policy through a variety of vehicles, including letters, speeches, statements and questions at hearings, committee reports on monetary policy, and bills and resolutions. The Congress over the years has used a variety of approaches to influence monetary policy. Perhaps most important, the Congress has set the goals for monetary policy in law. In addition, the Senate confirms nominees to the Board of Governors, and individual Senators can hold up Board member confirmations in an attempt to influence policy and appointments. The Congress can, if it decides, pass legislation that directly requires a specific monetary policy action.
The Congress can threaten to change the structure of the Federal Reserve - abolish the Federal Reserve at the extreme, or specify particular qualifications for Board members, or alter the composition of the FOMC - in an attempt to influence monetary policy. The Congress can demand an accounting of policy by summoning the Chairman, Board members, and Reserve Bank presidents to congressional hearings, in addition to the formal semiannual testimonies by the Chairman. The line between oversight and direct involvement in the conduct of policy is perhaps crossed when the Congress passes or even introduces a resolution or legislation that gives specific direction to raise or lower interest rates and, especially, when such directions are accompanied by proposed legislation that would reduce the independence of the Federal Reserve. The history of the past twenty years shows that members of the Congress do try to influence monetary policy, especially when the economy is performing poorly or when interest rates are high or rising, but that the Congress has rarely gone so far as to pass legislation to direct policy. Such legislation has often been introduced, however, and its introduction may be thought of as one way the Congress tries to persuade the Federal Reserve to alter its conduct of monetary policy. There are, to my knowledge, only one or two examples of legislation that passed with specific monetary policy directives and thus compromised the delegation of instrument independence to the Federal Reserve. I mentioned previously Concurrent Resolution 133, passed in 1975. At the end of November 1982, forty-two Senate Democrats introduced a resolution calling on the Fed to “achieve low enough interest rates to generate significant economic growth and thereby reduce the current intolerable level of unemployment”. The Democratic House approved this language in its version of the year-end continuing resolution. However, the final language in the continuing resolution included an important qualification that, in effect, left the discretion about the conduct of monetary policy to the Federal Reserve. Specifically, the Congress added the qualifying phrase “with due regard for combating inflation”. This is an excellent example of the broader tension in the relationship between the Congress and the Federal Reserve. On the one hand, the Congress honours the Fed’s independence in establishing policy and, on the other hand, individual members, particularly in difficult economic times, work to influence policy. At the most extreme end of efforts to change the structure of the Federal Reserve, bills have been introduced to repeal the Federal Reserve Act (thereby abolishing the Federal Reserve), to abolish the FOMC, to remove Reserve Bank presidents from the FOMC, or even to impeach Chairman Volcker and all the members of the FOMC at that time. Congressman Henry Gonzalez, a longstanding member and ultimately chairman of the House Banking Committee, brought a special zeal to these efforts and over the years was the author of several such measures. In return for granting the Federal Reserve “independence”, it seems to me that the Congress asks three things of us. First, we must do a good job promoting the objectives that the Congress has identified.
Second, we have to accept a certain amount of grumbling about the decisions that impose short-run costs, especially when unemployment is high or policy tightens preemptively to contain what the Fed perceives as inflation risks. We are always the ones taking away the punch bowl just as the party is getting good, with members of the Congress among those who always question the timing of any restraint. To be fair, members of the Congress are among the first to congratulate us when we lower rates! And there has, I have to admit, been plenty of praise for the Federal Reserve’s contribution to the recent exceptional economic performance. Third, the Federal Reserve must be fully prepared to get a substantial part of the blame for bad results (whether or not we caused them). The Congress keeps its part of the bargain by leaving the core of our operations alone, so long as things go right, and intervening only around the edges (hearings, speeches, letters, and the introduction of an occasional bill or resolution) to show that it remains alert to its oversight responsibilities and responsive to the concerns of constituents. The appointment process is an important element of the relationship with the Congress. Governors are subject to confirmation by the Senate. The confirmation process is a way for the Congress to influence the conduct of Federal Reserve policy, just as the appointment process offers this opportunity to the President. When the President and the congressional majority are from different parties, party politics can affect the Federal Reserve and may explain, in part, why today the Board has two vacancies plus a governor who is serving after the expiration of his term (since a governor can continue to serve in such a case until reappointed or until a new governor is appointed and confirmed).3 In recent years, delayed action on nominations for governor or on renomination of the Chairman has served as a vehicle for some Senators to express displeasure with the conduct of monetary policy.4 Another important relationship with the Congress is through hearings. The Chairman testifies frequently before the Congress, with the one-year record being twenty-five appearances in 1995, although only seven were directly about monetary policy. Other governors testify also, though less frequently, with a range of eight to twenty-two appearances per year in recent years. Typically the Chairman alone testifies on monetary policy. The most important testimony on monetary policy is delivered at semiannual hearings before the House and the Senate that began, as I mentioned, with the now-expired provisions of the Humphrey-Hawkins act. From 1978 until today, these have been semiannual appearances, in each case before the Senate and the House. But the Chairman is invited for many other hearings, including appearances before the budget committees, the Joint Economic Committee, and the banking committees. In addition, on rarer occasions, the Chairman and other Board members will visit with some members of the House and the Senate, either at the Board or on the Hill, mostly at the request of the legislators. These meetings are rarely about monetary policy; most focus on regulatory issues, including banking bills and the Community Reinvestment Act, but they are also sometimes about global economic developments, international financial crises, or international financial architecture issues. In addition, contact at the staff level between the Board and congressional committees is common.
The Board's staff is routinely asked for technical assistance in drafting legislation on banking, consumer protection, and amendments to the Community Reinvestment Act and other areas. Finally, members of the Congress often write letters to the Board - individually and in groups - typically urging a specific direction for monetary policy. Since joining the Board in June 1996, I have seen numerous such letters - all either expressing concern about high interest rates or, in most cases, urging the FOMC either not to raise interest rates or to lower them. The largest number of signers during this period was eighty, for a 23 September 1996 letter urging the FOMC not to raise interest rates. These letters are perhaps best understood as attempts by the Congress to alert the Fed to the pain of constituents as a byproduct of the conduct of monetary policy - typically when the Fed is preemptively raising interest rates in an effort to prevent higher inflation or trying to unwind an earlier increase in inflation, or not sufficiently stimulating a sluggish economy. Once having admonished us in an effort to make sure we understood the consequences of our policies, the Congress has generally relied on us to balance inflation and stabilisation objectives, as is the implicit contract under a regime under which the Congress has delegated instrument independence to the Federal Reserve.

[3] When Susan Phillips' term expired on 31 January 1998, it took the Administration a year and a half to nominate Carol Parry for that position. Alice Rivlin announced on 4 June 1999 that she would be resigning from the Board, but the President has not nominated anyone to replace her. In the meantime, the President renominated Vice Chairman Roger Ferguson to a full term as governor after the expiration of his short partial term on 31 January 2000. The Republican leadership of the Senate Banking Committee has refused to hold hearings for either the new nominee, Parry, or Ferguson, reserving the opportunity for the new President to make these appointments and therefore possibly to convert the positions from Democratic to Republican appointments. In the meantime, the Board remains below its statutory number of members and could remain so for many months after the new President takes office, if past experience is any guide to the time it takes to make appointments and get them through the confirmation process.

[4] A practice in the Senate called a "hold" allows a member - through his or her majority or minority leader and without exposing his or her identity - to hold up, virtually indefinitely, a vote on a nomination to the Federal Reserve Board or any other government position that requires confirmation by the Senate. I have first-hand knowledge of this practice, since a hold delayed the confirmation of my nomination for several months. Actually, the hold was applied to the renomination of the Chairman, but the nominations of myself and Alice Rivlin as governors were viewed as part of a package and therefore action on all the nominations was delayed. Even without a formal hold, the chairman of the relevant committee has discretion on whether or not to begin the confirmation process because the chairman controls the scheduling of a hearing.
Outstanding issues related to independence and accountability

In my judgment, the Federal Reserve Act - together with the policymaking structure as amended by the Banking Act of 1935, the policy mandate as introduced in 1977 and 1978, and the informal relationships that have evolved - results in an excellent balance of independence and accountability for the Federal Reserve. Nevertheless, the resulting balance offers opportunities for political influence so that sustaining Federal Reserve independence in practice depends on building a tradition of independence within the Federal Reserve and on the strength of will and public prestige of Chairmen, in particular, but also of other FOMC members, in resisting efforts at political control. Also, controversies linger about whether or not the policy mandate should be refined and whether transparency and disclosure should be further enhanced. The legislative mandate under which the Federal Reserve operates is, as I noted earlier, different from the mandate applied to most other central banks. It explicitly sets out a dual mandate and, should there be short-run conflicts, does not identify any priority between the two objectives. There has long been a small group in the Congress who would like to revise the language related to the policy mandate to elevate the role of price stability to the single or at least principal objective for monetary policy. They press this issue not out of dissatisfaction with the conduct of monetary policy but because they believe such a revised policy mandate would strengthen the credibility of the Federal Reserve's commitment to price stability and thereby allow the central bank to carry out this commitment in the most efficient way. However, a larger, if less vocal, group in the Congress strongly opposes abandoning a commitment by the Federal Reserve to promote full employment through its conduct of monetary policy. A related issue is the precision with which the objectives should be stated. The mandate, for example, sets out full employment and price stability as objectives but leaves to the Federal Reserve the precise definition of those goals. As recent experience confirms, it would be difficult and unwise to set any numerical target for full employment, given the uncertainty about what that target should be and the likelihood that this target would vary over time with demographic changes in the labour force, government policies, and changes in the efficiency of the matching process between jobs and unemployed workers. The Federal Reserve has never set an explicit numerical target for inflation. Chairman Greenspan has defined price stability as inflation so low and stable that it no longer affects the decisions of households and businesses. However, today, a growing number of governments have set explicit numerical targets for their central banks. A second broad issue has to do with transparency and disclosure. Over the years, there has been an evolution toward greater disclosure and transparency, but some believe that we ought to be looking for opportunities for further progress. The major questions relate to the speed of release of the minutes and of the transcripts.
Conclusion

The key to the effective operation of the central bank within government is a well-designed policy mandate; a high degree of formal instrument independence; complementary informal relationships to ensure appropriate coordination without undermining instrument independence; a disciplined, regular process of legislative oversight; and a high degree of transparency and disclosure. The result is a good balance among government-mandated objectives, instrument independence, flexibility, and accountability. But this very balance keeps open opportunities for political interference and requires continuing energy and focus - both inside the Federal Reserve and within the rest of government - on sustaining the independence of the Fed. As a result, we should not take the independence of the Fed as unassailable but rather as a principle that has to be defended.

References

Bade, R, and M Parkin: "Central Bank Laws and Monetary Policy", Working Paper, Department of Economics, University of Western Ontario, October 1988.
Barro, Robert J, and David B Gordon: "Rules, Discretion and Reputation in a Model of Monetary Policy", Journal of Monetary Economics, vol 12, no 1, July 1983, pp 101-21.
Briault, Clive, Andrew Haldane, and Mervyn King: "Independence and Accountability", Working Paper Series, no 49, Bank of England, April 1996.
Brimmer, Andrew F: "Politics and Monetary Policy: Presidential Efforts to Control the Federal Reserve", remarks before the 75th Anniversary Luncheon, Board of Governors of the Federal Reserve System, 21 November 1989.
Cukierman, A: Central Bank Strategy, Credibility and Independence: Theory and Evidence. Cambridge, Mass.: MIT Press, 1992.
Debelle, G, and S Fischer: "How Independent Should a Central Bank Be?", mimeo, MIT, 1994.
Greider, William: Secrets of the Temple: How the Federal Reserve Runs the Country. New York: Touchstone, 1989.
Grier, Kevin: "Congressional Influence on US Monetary Policy: An Empirical Test", Journal of Monetary Economics, October 1991, pp 201-20.
Grier, Kevin: "Presidential Elections and Federal Reserve Policy: An Empirical Test", Southern Economics Journal, vol 54, no 2, October 1987, pp 475-86.
Grilli, Vittorio, Donato Masciandro, and Guido Tabellini: "Political and Monetary Institutions and Public Financial Policies in the Industrial Countries", Economic Policy: A European Forum, vol 6, no 2, October 1991, pp 341-92.
Havrilesky, Thomas M: The Pressures on Monetary Policy. Boston, Mass.: Kluwer Academic Publishers, 1993.
Kydland, Finn E, and Edward C Prescott: "Rules Rather Than Discretion: The Inconsistency of Optimal Plans", Journal of Political Economy, June 1977, pp 473-91.
Lohmann, Susanne: "Optimal Commitment in Monetary Policy: Credibility vs. Flexibility", American Economic Review, vol 82, no 1, March 1992, pp 273-86.
Minutes of the Board of Governors of the Federal Reserve System, 3 December 1965.
Nordhaus, William D: "The Political Business Cycle", Review of Economic Studies, vol 42, no 2, April 1975, pp 169-90.
Rogoff, Kenneth: "The Optimal Degree of Commitment to an Intermediate Monetary Target", Quarterly Journal of Economics, November 1985, pp 1169-90.
Schwartz, Anna J: "Central Banking in a Democracy", Western Economic Association, July 1997.
Tufte, Edward: Political Control of the Economy. Princeton, NJ: Princeton University Press, 1978.
US House of Representatives: An Act to Lower Interest Rates and Allocate Credit. Hearings on H.R. 212 before the Subcommittee on Domestic Monetary Policy of the Committee on Banking, Currency and Housing, 94th Cong., first sess., 4-6 February 1975. Washington, DC: US Government Printing Office, 1975.
US House of Representatives: The Federal Reserve Accountability Act of 1993. Hearing before the Committee on Banking, Finance and Urban Affairs, 103rd Cong., first sess., 13, 19 and 27 October 1993. Washington, DC: GPO, 1994.
US Senate: Monetary Policy Oversight. Hearings on S. Con. Res. 18 before the Committee on Banking, Housing, and Urban Affairs, 94th Cong., first sess., 25-26 February 1975. Washington, DC: GPO, 1975.
Walsh, Carl: "Optimal Contracts for Central Bankers", American Economic Review, vol 85, no 1, March 1995, pp 150-67.
Alan Greenspan: The important engine of growth for our mutual economies - the force of globalization

Speech by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, at the Banco de Mexico 75th Anniversary Conference "Stabilization and monetary policy: the international experience", held in Mexico City, on 14 November 2000.

* * *

I am honored to be speaking before this distinguished group on the occasion of the Bank of Mexico's 75th anniversary. Appropriately, major mileposts encourage introspection and a search for perspective. This morning, I shall focus my remarks on the important engine of growth for our mutual economies - the force of globalization. Although globalization has its critics, I say with some conviction that the increasing interaction among national economies has engendered benefits that have significantly exceeded their costs over the years. And the clearest way to understand those net benefits in the 21st century is to examine the record of the prior two centuries, both for ways in which the current experience is similar and for ways in which it is different. After touching upon the benefits that closer linkages have provided to us all, I will then discuss the importance of not ceding the progress that we have won thus far.

How the world is similar

Though economic data are increasingly suspect as we look further back into the past, three regularities in the long sweep of the record strongly suggest that the degree of globalization today is not measurably greater than that prevailing in the century-ago world of our great grandparents. One is the importance of trade in the overall economy. Measures of total trade across industrial countries, including those that simply sum merchandise exports and imports, have grown rapidly relative to total output over the past fifty years. But this growth mostly reverses declines in the ratio of total trade to GDP in the first half of the 20th century. In fact, for many industrial countries, total trade as a share of GDP is not much above levels that were commonplace in the late 19th century. Second, trade in goods was accompanied by substantial trade in assets. A century ago, net capital flows across the major industrial economies were often greater than today when scaled to GDP. This integration of global financial markets was made possible in part by technological advances, including importantly the laying of the transatlantic cable in 1866. And, like today, open markets allowed capital to flow to the most productive uses where prospective returns were judged to be highest. Third, because funds could flow across national borders to their most profitable uses, national investment was not limited to the pool of national saving. Indeed, in the last few decades of the 19th century, saving and investment at the national level apparently were far less correlated with each other than they were for most of the 20th century, suggesting that, relative to the size of our domestic economies, the globalization of investment financing was greater in the latter part of the 19th century than in the succeeding century. Thus, our great grandparents lived in a world in which the product of their efforts may well have been sent to foreign shores. Quite often, those efforts were funded in part by foreign investors. As a result, what happened in the financial markets of the City of London, however distant, would echo around the globe.
Although this system produced inevitable errors of mispricing and panic on occasion, it reliably funded the opening of new economies and the rolling back of frontiers across the Americas. A considerable portion of the most impressive infrastructure built over the centuries - including the center of old Mexico City itself, our system of canals in the United States, and thousands of miles of railroad track bed and bridges in all our countries - provides eloquent testimony to the net benefit of that international trade and finance.

How the world differs

But we should remember that the world of the 19th century differed in important respects from our own. For one, our great grandparents were more likely to relocate. Given the great waves of immigration in the mid and late 19th century, it was not unprecedented in some countries that migration would change the population by one-tenth in a decade. The erection of hurdles to the free flow of workers since then implies that our national relationship with foreign countries is more likely to reflect commercial interests than lingering ties of earlier origin. It also requires that capital and managers relocate to tap the pool of lower-cost workers available worldwide, helping to explain both the rise of multinational firms and much of the expansion in real wages over time in developing countries. The output of our workers also differs. While ore-laden ships still cross the Great Lakes, quite often goods of far higher value are packed in the hold of a single cargo jet bound for a more distant location. Simply put, the advent of the microchip has allowed producers to increase the value of output while shrinking the physical volume it takes up. The range of innovation in the high-technology industry is truly awesome, bringing new products on line at a staggering pace and directly adding to the advance of output per hour worked in that sector. And as knowledge and skill in harnessing this equipment diffuses through the rest of the economy, other workers generally become more productive as well. That is, advances in the new economy have spilled over to more established goods production. I hesitate to use the phrase "old" economy because I am not sure how much of the truly old remains in an economy where information from global positioning satellites is used to guide "old economy" tractors in the field, robotic arms swing car doors in place on "old economy" assembly lines, and seismic soundings have, during the past decade, doubled the odds of success in that stalwart of the "old economy", wildcat drilling. Innovations in inventory control and better and more accurate routing have made it cheaper to bring more traditional goods onto the world market. Taken together, modern producers of goods - whether low or high tech - move goods between national markets at a lower real cost and have more means at their disposal to meet foreign demands more flexibly than our great grandparents could have ever imagined. In part for these reasons, world markets are increasingly important for those who produce goods that can be traded. While I noted earlier that merchandise trade as a share of GDP in the United States is not much different today than it was in the late 19th century, it is important to remember that the composition of GDP has changed considerably. One hundred years ago - even fifty years ago - agriculture, mining, and manufacturing made up about 40% of US output. Today, that figure stands closer to 20%.
On the spending side, governments have tended to take increasing shares of total output. Thus, comparing the 21st to the 19th century, a proportionally smaller industrial base now supports a similar relative volume of trade, suggesting that we are now more reliant on foreign markets. The ongoing revolution in computing and communication has also allowed US producers of services to find foreign buyers as well. At the end of the 19th century, US exports of services were de minimis. Indeed, as late as 1970, service exports amounted to about 1% of GDP. Today, that share stands closer to 3½%. Representative of this progress are the rapid advances in financial services, which have knitted together national markets and added value by searching out prospective high-return firms and projects. The speed at which capital can now cross national borders is reflected in a considerably higher short-term component of capital flows than was the case in the 19th century. In those earlier days, by contrast, the slower pace of round-trip investment curbed the incentive to accumulate short-term assets. Of course, freely flowing capital brings costs along with benefits. The short-term nature of capital flows implies that their direction can reverse quickly, sometimes with quite disturbing consequences. As opposed to the 19th century, we have mechanisms to help cushion the effects of crises and a willingness to change national monetary policies when the need arises.

The role of policy

As with our great grandparents, our own economies are made better by our interaction with the wider world around us. International trade in goods, services, and assets is the chief means of facilitating that interaction. And those interactions and connections in recent decades have become stronger. To some extent, we can credit good national policy making for this. The progress in lowering trade barriers since World War II marks the triumph of putting an important idea into practice - that international trade benefits all nations. Indeed, in every nation, those benefits are shared by people spread across quite different income brackets. The pity is that this idea has been well known in the economics profession for the past two centuries. To be sure, some of this recent progress was bred by necessity, as national governments came to appreciate the patent inevitability of globalization. By lowering the costs of transacting and sharing information, technology has reduced market frictions and provided significant impetus to the process of broadening world markets. Expanding markets, in turn, have increased competition and narrowed the ability of governments to influence economic outcomes. In recognition both of the prosperity possible through an open trading system and of their lessened ability to halt that tide, many governments have reduced tariffs and trade barriers and, when necessary, deregulated markets. These actions themselves have further promoted globalization.

The risks we face

Understanding the process by which this progress has been accomplished highlights a critical risk going forward. Simply put, good economic performance has made it easier to make good economic policy. However, any notable shortfall in economic performance from the exemplary standard of recent years runs the risk of reviving mistrust of market-oriented systems, even among conventional policymakers. Thankfully, such views are not widespread, and most fall quickly to the force of reason.
Still, the arguments against the global trading system that emerged first in Seattle and then spread over the past year arguably touched a chord in many people, in part by raising the fear that they would lose local political control of their destinies. As some analysts have noted, protests have arisen not against “economic forums” per se, but rather against “world economic forums”. Clearly, the risk is that support for restrictions on trade is not dead, only quiescent. In many important respects, the past half century has represented an uneven struggle to repair the close linkages among national economies that existed before the First World War. The hostilities bred of war, the substantial disruptions to established trading patterns associated with that conflict, and the subsequent poor economic performance over the next few decades triggered the erection of trade barriers around the world that have taken even longer to dismantle. To repeat that error would be a tragic act of foolishness and waste. Central bankers can make two contributions to ensure an open trading system. For one, we should not hesitate to remind our fellow citizens of the manifest net benefits of free trade in goods, services, and assets, benefits that accrue not only to all trading partners on average but also especially to some of the least fortunate within those trading societies. I would further emphasize that the free market system has proven itself better than all other forms of organization dedicated to harnessing the underlying competitive forces of the division of labor and comparative advantage. For another, we monetary policymakers must keep hold of the anchor provided by price stability so as to support maximum sustainable economic growth over time. By fostering such economic performance, our arguments for free trade and open markets should find a receptive audience.
Alan Greenspan: Technology and banking

Speech by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, at the sixth annual reception for regulators, sponsored by Women in Housing and Finance, Washington, DC on 20 November 2000.

* * *

I am pleased to join you this evening, and would like to thank Diane Casey, your former president, as well as Nancy Camm and the current officers for inviting me to participate. Women in Housing and Finance certainly deserves commendation for its twenty very successful years of providing a forum for individuals, particularly women, from across the spectrum of participants in the financial services industry to exchange views on policies and market developments. More important, I want to applaud the energy with which your organization is approaching its next decades of service. Clearly, we have witnessed a rapid evolution of financial markets in recent years, and the likelihood of continuing fundamental change is high. Your work in studying and interpreting these changes will be no less challenging in the years ahead than in the past twenty. The remarkable innovations and adaptations that have permeated many aspects of our economy, especially in the most recent years of your history, are a part of the ongoing process of creative destruction that moves our economic potential forward as new technologies displace the old. Over the past decade, we have witnessed an acceleration of this process of change, not only in the real economy but in finance as well, with innovations that have swept through the banking, mortgage finance, and securities industries. These innovations have brought with them new products and services, and, of necessity, evolution in the approaches required of both managers and regulators. The unbundling of risk that accompanies the deconstruction of financial products into their constituent parts has revolutionized finance, irreversibly transforming the way services are provided, as well as by whom. What is particularly impressive is that, fueled by both computing and telecommunications capabilities, the pace of financial innovation does not appear to be slowing. Technological advance has expanded the scope and utility of our financial products and - as I noted - has increased the ability to unbundle risks. It has also promoted the faster and freer flow of information throughout the financial system: we are quickly moving to real-time systems, not only with transactions but also with knowledge. Of course, the process of change poses challenges to the institutions competing to adapt to the evolving needs of businesses and consumers. And it challenges those of us here this evening - policymakers, supervisors, and regulators - who are being pressed to reevaluate how we meet our responsibilities. Although the safety net necessitates greater government oversight, in recent years rapidly changing technology has begun to render obsolete much of the examination regime established in earlier decades. Regulators are perforce being pressed to depend increasingly on greater and more sophisticated private market discipline, still the most effective form of regulation. Indeed, these developments reinforce the truth of a key lesson from our banking history - that private counterparty supervision remains the first line of regulatory defense. The speed of transactions and the growing complexities of financial instruments have required a focus more on risk-management procedures than on actual portfolios.
Indeed, I would characterize recent examination innovations and proposals as attempting both to harness and to simulate market forces in the supervision of banks. The impact of technology on financial services and therefore, of necessity, on supervision and regulation is the critical issue that frames the supervisory agenda as we move into the 21st century. In today’s more complex world, the diversity of financial product choices facing consumers and businesses is truly astonishing. The complexity, as you know, has provided consumers with more choice but presented new challenges as well. In modernizing our banking laws and making them more consistent with marketplace realities, the Congress ensured that the financial services industry can expand and innovate with far fewer artificial constraints. How various financial providers choose to exploit these wider opportunities will be among the more interesting dynamics of the years ahead. What will the financial services industry look like in WHF’s 40th year? Given the rapidity of innovation and technological change, that is impossible to predict with any certainty. Accordingly, none of us can, a priori, lay out an optimal model either for financial services providers or for financial regulators. For policymakers, supervisors, and regulators, I would only suggest some general guidelines for the coming years: proceed cautiously, facilitate and participate in prudent innovation, allow markets to signal the winners and losers among competing technologies and market structures, and overall - as the medical profession is advised - do no harm. We are all fortunate to have the opportunity to be a part of this remarkable transformation of the financial services industry into one that provides more useful financial products and services to a broader spectrum of consumers and businesses. As I see it, the possibilities for education and debate will abound in coming years, and you and your colleagues are well positioned to continue to provide leadership in addressing the broad spectrum of issues that will confront us.
Alan Greenspan: Structural changes in the economy and financial markets in the United States

Remarks by Mr Alan Greenspan, Chairman of the Board of Governors of the US Federal Reserve System, at the America's Community Bankers Conference, Business Strategies for Bottom Line Results, held in New York, 5 December 2000.

* * *

Good morning. It is a great pleasure for me to join America's Community Bankers for your winter conference. I would like to take the opportunity today to talk with you about some of the structural changes in the economy and financial markets that are challenging both bankers and policymakers. Technological innovation, and in particular the spread of information technology, has revolutionized the conduct of business over the past decade and resulted in rising rates of productivity growth. Accelerated productivity has been elevating standards of living, and it has been containing cost and price pressures, even as the economy operates at unusually high levels of labor resource utilization. Higher prospective rates of return from the application of the newer technologies have led to a surge in business capital spending. And, in recent years, the capitalization of those higher expected profits has boosted equity prices and contributed to a significant pickup in household spending on houses, durable goods, and consumption more generally. Also contributing has been the measurable rise in the turnover of existing homes, engendering a marked increase in realized capital gains. For a long time, those who were advancing funds shared the sanguine expectations of those using the funds for rapid increases in profits and incomes, and credit and equity were available with unusually low risk spreads. During the past couple of years, however, the widespread optimism that was apparent in financial markets has given way to some reassessment of risks and opportunities. This process has been underway ever since the global financial crisis in the fall of 1998. That episode forced many market participants to recognize the potential for international risks to feed back on US markets. Events brought into sharper focus the possibility that liquidity in many markets can dry up simultaneously when fear spurs risk aversion, and an intense, near-term focus on protecting capital values markedly elevates the demand for liquidity. Markets largely recovered from that episode, but an imprint was left in the form of wider credit spreads and more cautious behavior on the part of banks and other lenders. Recently, wariness about risk again has increased as default rates on less than investment-grade bonds have moved higher, debt downgrades have become more commonplace, and many high-flying dot-com ventures have collapsed. More broadly, equity market analysts have been revising down their near-term profit forecasts - with revisions occurring across a range of industries. As a consequence, stock prices this year have given back some of the extraordinary gains posted in recent years, risk spreads have widened appreciably in markets for lower-rated long-term and short-term credits, and - as I'll be discussing in more detail later - banks report that they have tightened terms and standards on business loans. To be sure, our current circumstances are in no way comparable to those of 1998.
Financial markets have continued to function reasonably well, and credit continues to flow, although admittedly with reduced availability to less-than-investment-grade borrowers and at interest spreads sufficiently elevated to press on profit margins of those lower-rated borrowers. Both lenders and borrowers are reassessing their positions in light of an apparent uptick in domestic risks, but the palpable fear that dominated financial markets at the height of the crisis two years ago is not now in evidence. Net funds raised by non-farm, non-financial business are estimated to have risen in November from October levels, though remaining below the rates of earlier this year. Why then, one might ask, is this process of reassessment taking place now? In large part, it appears to be the expected byproduct of the economy's transition to a more sustainable balance in the growth of demand and supply. The orders and output surge this past year in a number of high-technology industries, amounting in some cases to 50% and more, was not sustainable even in the more optimistic new economy scenarios. Technological innovation combined with intense competition has resulted in some overreaching. Many firms rushed to gain first-mover advantages of the newer technologies. The successful creation of industrywide networking standards was expected to allow these firms to dominate a particular market niche and, thus, to reap a considerable reward in earnings. Demand for high-tech equipment and fiber optics expanded rapidly, but in some segments of the market available supply appears to have increased even faster. To the extent that some aspiring entrepreneurs entered the tail end of a short-term boomlet, there was bound to be some disappointment. In many respects, the situation may be analogous to a phenomenon of which I am sure many of you are all too painfully familiar - the tendency to overbuild in commercial real estate when low vacancy rates prompt commercial building starts well beyond what, on completion, could be supported by the ongoing growth in demand. Problems have even arisen among a number of well-established companies whose forays into uncertain newer technologies have come up short. To a considerable degree, then, the current shakeup in some segments of the telecom and other high-tech sectors seems to reflect an inevitable winnowing process as the market begins to draw firmer conclusions about which firms will be able to establish a long-lived market niche and which will not. Of course, these events are not inconsistent with investment in high technology continuing to serve as an engine of strong productivity growth in the years ahead. As one might expect, the cyclical component of the growth of output per hour has slowed, but during the summer months output per hour advanced at a pace sufficiently impressive to affirm a definitely elevated underlying rate of structural productivity growth from the levels of a decade ago. Moreover, despite recent short-term earnings disappointments, many corporate managers appear not to have measurably altered their long-standing optimism on the future state of technology. At least this is the impression one gets from the persistent upward revision through most of this year of security analysts' long-term earnings projections. Analysts, one must presume, obtain most of their insights from corporate managers, who are most intimately aware of the potential gains from technological synergies and networking economies.
According to IBES, a Wall Street firm, the three- to five-year average earnings projections of more than a thousand analysts, though exhibiting some signs of flattening in recent months, have generally held firm. Such expectations, should they persist, bode well for continued capital deepening and sustained growth of structural productivity over the longer term. Admittedly, however, shifts in the growth of structural productivity are clearly visible only in retrospect. The adjustment to a more sustainable supply-demand alignment is occurring in the economy more broadly, as well as within some high-tech sectors. For some time, growth in aggregate demand exceeded even the productivity-enhanced expansion of potential supply. More recently, though, the pace of expansion of economic activity has moderated appreciably, in part as tighter financial conditions have had some impact on interest-sensitive areas of the economy. Homebuilding has declined over much of this year, though more recently demand appears to have largely stabilized in response to the decline in mortgage rates that has occurred since the spring. Motor vehicle sales continue to slip, however, reflecting, in part, the unwillingness of households and businesses, faced with today's financial market conditions, to add to the stock of cars and trucks on the road at the pace observed over the past year. Growth in the demand for consumer durables generally appears to be shifting down after the rapid gains of the past few years. This softening, in turn, has fed back into reduced demand from a large segment of the so-called old economy that supplies consumer durable markets. Given the difficulty employers have had with building up their workforces in recent years, it is not surprising that the easing in demand growth since the spring has been reflected more in hours worked than jobs. As a consequence, labor markets have remained tight. But the recent increase in initial unemployment insurance claims and the level of insured unemployment may be an early harbinger of an easing of these conditions. In periods of transition from unsustainable to more modest rates of growth, an economy is obviously at increased risk of untoward events that would be readily absorbed in a period of boom. The sharp rise in energy prices, if sustained, is worrisome in this regard. As we learned from previous episodes, rising energy prices could engender risks to both inflation and economic activity. If accommodated by monetary policy, the jump in energy prices could spill over into general inflation and inflationary expectations, as was so evident in the 1970s. At the same time, the hike in the price of imported energy has acted, in effect, as a tax equivalent of roughly 1% of national income. Although there is as yet little evidence of the type of destabilizing inflationary pressures observed in the aftermath of previous oil price spikes or of exceptionally large restraint on consumer spending, Middle East tensions have heightened such risks. The most significant effect to date from higher energy prices appears to be on profit margins, where corporate businesses, constrained by competitive market forces, have not been able to raise prices to fully offset energy cost increases. We estimate that owing to the rise in oil, natural gas, and electric power prices, energy costs of nonfinancial, nonenergy corporations have increased at a 40% annual rate since the spring of 1999.
Apparently, most of the increase has eaten into the margins of domestic corporations outside the energy sector. With equity prices weakening in response to reduced earnings from higher costs and a more moderate pace of sales, the "wealth effect" that spurred consumer spending is being significantly attenuated. Moreover, high and rising equity prices had facilitated a good deal of financing for newer companies, both in the equity and bond markets. Widened spreads in the high-yield markets reflect, in part, a reduced potential for new equity issuance to support debt servicing. Higher costs of capital for these companies are likely exerting some restraint on overall business capital spending. Nonetheless, in the face of the energy price spike and the erosion of optimism in financial markets, consumer confidence, or sentiment, appears to be holding up reasonably well to date, though there have been some mixed signals of late. And although new orders for high-tech equipment have slowed from their torrid pace of earlier this year, order backlogs for such equipment continued to rise through October, and capital deepening and structural productivity growth continued apace. Still, in an economy that already has lost some momentum, one must remain alert to the possibility that greater caution and weakening asset values in financial markets could signal or precipitate an excessive softening in household and business spending. As might readily be expected, these changing economic dynamics, of necessity, are being reflected in our banking markets. Technological innovations played a key role in rendering decades-old banking laws and regulations obsolete. The relaxation of these regulations has, in turn, further reduced barriers to competition and accelerated the modernization of our financial system. That evolution, however, must continue to occur in a manner that preserves the fundamental soundness of the financial system and, in particular, the nation's banks. History teaches us that a sound banking system, willing and able to take deposits and extend credit, is a prerequisite for the long-term health of the national economy. Securities markets alone will never be able to substitute for the extensive and detailed knowledge that bankers - especially community bankers - bring to the intermediation process. In addition, changes in regulation and supervision induced by technological innovations in information processing are just a small part of the sea change in banking. Far more important is that these technologies have made it possible for banks and other financial firms to adopt business models and to offer customers a range of products and services that literally would have been impossible only a few years ago. What has transpired in the banking industry has been just a microcosm of the sweeping changes in the economy at large in recent years - changes that also present both risks and opportunities to banks. Since the mid-1990s, the banking industry has enjoyed an unusually strong and steady growth of profits, coupled with improving asset quality and vibrant loan growth, which reflected, in turn, the growing and innovative real economy. Bankers, however, were not immune from the same optimism that affected equity markets, leading, in some cases, to less-than-realistic assessments regarding borrowers' ability to repay. In more recent periods, pockets of weakness have emerged, especially in large syndicated credits.
I believe that these developments are inducing more realistic assessments of risks by banking organizations. For some institutions, lapses in risk evaluation have come at some cost as asset quality has deteriorated, net charge-offs have risen, and profitability has fallen. Recent reports by certain large banking organizations point to some further deterioration of asset quality. Fortunately, bankers have generally been well prepared to meet these developments. The industry's base of earnings is historically strong and well diversified, and although credit costs as well as problem and classified assets have risen, they remain historically modest relative to assets and capital. Some bankers attribute the recent rise in problems to the relaxation of lending standards and terms of two or three years ago. During that period, supervisors had noted the risks posed by an overly optimistic outlook and by a lack of stress testing. However, while supervisors like to think that banking organizations are attentive to our every warning, there is nothing like actual loan downgrades and losses to focus management's attention. One can hope that the practical experience provided by recent events will translate into better structured transactions and improved risk-adjusted pricing. Indeed, that does seem to be the case. For example, the Federal Reserve's November Senior Loan Officer Opinion Survey suggests that lending standards and terms have continued to tighten in the wake of weakened borrower conditions. In addition, more than half the institutions indicated that they anticipated further tightening of standards and terms before the end of 2001. At the same time, it is important that the response of management to these concerns not be overdone. The rise in the problems that we are observing can in some sense be attributed to the kinds of overreaching typically experienced during strong economic periods, when downside scenarios are more challenging to visualize fully, when the ambitions of borrowers are at their height, and when competition for market share is an especially driving force. Though lenders will be viewing new transactions with greater caution than they did a couple of years ago, both bankers and their supervisors should now guard against allowing the pendulum to swing too far the other way by adopting policy stances that cut off credit to borrowers with credible prospects. Despite some isolated pronounced losses at a few large institutions, their effect on earnings so far has been modest in the aggregate. More widespread have been shrinking net interest margins arising from a rapid rise in liability costs, with more than a third of the largest bank holding companies experiencing declines in these margins of 25 basis points or more compared with a year ago. On the other hand, savings institutions have had less severe margin contractions, and smaller commercial banks have generally been able to keep their margins stable. The relatively moderate margin decline for savings institutions may reflect the success of their hedging programs. The more favorable margin trends at smaller commercial and savings institutions are also probably linked to their relatively higher levels of core deposits, which have been less sensitive to rising rates. That said, a growing dependence on wholesale funding is becoming an established trend for large and small institutions alike. As loan growth has greatly exceeded that of deposits, liquidity benchmark ratios such as loans-to-deposits have reached historic peaks.
Although day-to-day decisions about wholesale versus retail funding may appear immaterial at the time, the effect of such decisions may gradually transform the overall liquidity and risk profile of an institution. It is crucial, therefore, that bank managers take stock of how their balance sheets have evolved - including the widening menu of choices available to customers on both sides of that balance sheet - and understand the accompanying implications. The freedom to manage your liability structure is still constrained by one remnant of 1930s-era banking regulation - the prohibition of interest on demand deposits. This is of particular concern to community bankers, of course, given that larger banks are offering interest to their customers through sweep accounts. Pending legislation modernizing the law would potentially help bolster deposit growth and open opportunities for other profitable customer relationships without the unproductive and costly circumventions of the existing statute. Capital, of course, is the key to maintaining stability in the midst of the risks of the evolving financial and economic landscapes. Recognizing that fact, regulators are working, through the international Basel Capital Accord, toward implementation of more risk-sensitive capital standards. That effort, however, is largely directed toward the diversity of risks facing the largest banks and will undoubtedly require an even greater level of complexity than the current standards. For those of you who already view the current capital framework as too complicated, let me offer some encouraging words. The US banking agencies published last month an advance notice of proposed rulemaking for a simplified capital framework that would apply to banks with less complex risk structures. The notice sets forth very broad options for a less burdensome standard, including the use of a simplified risk-based ratio, a stand-alone leverage ratio, or a leverage ratio modified to capture off-balance-sheet exposures. There are many pros and cons to each of these alternatives, and the banking agencies are eager to receive your views as well as your ideas for other alternatives. We are hopeful that this dialog will lead to a less burdensome yet prudent rule for noncomplex institutions. In closing, the transition of the US economy to a more sustainable supply-demand relationship is posing challenges for businesses, banks, and monetary policymakers. How well banks perform under these conditions will depend on their ability to continuously reevaluate previously held assumptions and adapt to change. Fortunately, US banking organizations of all sizes have the right tools to thrive in this environment, in the form of improved risk management techniques, more diverse earnings, and strong capital bases. I am confident that banking organizations will continue to monitor and respond to risks while meeting customer demands in a way that strengthens our underlying financial structure in the years to come.
Roger W Ferguson: Technology, Macroeconomics, and Monetary Policy in the United States

Remarks by Mr Roger W Ferguson, Jr, Vice Chairman of the Board of Governors of the US Federal Reserve System, at the Rochester Institute of Technology, Rochester, New York on 6 December 2000.

* * *

Thank you for inviting me to the Presidential Colloquium at Rochester Institute of Technology. This is a particularly appropriate place to discuss the effects of technological change on the economy and some of the implications for monetary policy. As always, the views I will be expressing are my own and are not necessarily shared by other members of the Board of Governors or the Federal Open Market Committee. Let me start with a brief review of the extraordinary performance of the U.S. economy over the past five years. Since 1995, real gross domestic product has grown, on average, more than 4-1/2 percent per year. This pace is significantly above that in the previous five years, and you have to go back to the 1960s to find even closely comparable periods of consistently robust economic expansion. In this environment, the unemployment rate has fallen to 4 percent, and the underlying rate of price inflation has slowed, on net, despite very high rates of resource utilization. Even the most optimistic of forecasters could not have anticipated such a favorable confluence of economic events.

Productivity Growth and Cost Reductions

So, what happened? As a policymaker, I'd like to think that well-executed monetary and fiscal policies--each focused importantly on their respective long-run goals of achieving price stability and reining in deficit spending--played some role in creating economic conditions that fostered non-inflationary economic growth. Our economy has also benefited from past actions by the government to deregulate industries. The removal of unnecessary government regulation started more than twenty years ago during the Administration of President Ford and gathered momentum during the Carter years. It has altered the business landscape by allowing, indeed forcing, businesses to focus more clearly on a more competitive marketplace with fewer constraints and increased flexibility. But the dominant force of late appears to have been a significant increase in the rate of productivity growth: Output per hour in the nonfarm business sector--a conventional measure of productivity--has increased at an annual rate of almost 3 percent since 1995, well above the pace earlier in the decade. Cyclical forces such as the inability of businesses to add to their payrolls as rapidly as they would have liked in response to the rise in demand have probably played some role in these efficiency gains. But I suspect that longer-term, structural changes, reflecting the boom in capital spending and the revolution in information technology, probably have been more important. Let me turn to the evidence on this point.

Technology Change and Productivity Growth

Bob Solow--the MIT economist who won the Nobel Prize in economics for his work on the theory of economic growth--once quipped that you can see computers everywhere except in the productivity statistics. A few years ago that situation began to change, and we now have strong evidence that the faster productivity growth our economy has experienced is in fact due partly to newer technologies. Research by economists Steve Oliner and Dan Sichel of the Federal Reserve Board staff sheds some light on the sources of this faster productivity growth.
About 1/2 percentage point of the increase in productivity growth over the 1995-99 period can be attributed to so-called "capital deepening," most of which reflected greater spending by businesses on computers, software, and communications equipment. The high (and rising) levels of business investment raised the amount of capital per worker and thereby boosted productivity. Another 1/2 percentage point of the pickup in productivity growth reflected technological innovations in the actual production of computer hardware and semiconductors as well as better management--perhaps assisted by these high-tech investments--of the nation's capital and labor resources. Oliner and Sichel estimate that the consolidated influences of high-tech investments account for about two-thirds of the acceleration in productivity since 1995. This research supports the view that fundamental changes are underway in our economy.

What's So Special about this Capital?

While it is interesting to note that trend productivity has picked up and that high-tech investments are the source of the acceleration, by now these are not new observations. Perhaps at this stage it is more useful to explore more deeply this positive "shock" to the ability of our economy to produce goods and services. What is so special about computers and other information technologies that they can have such an impact on our economy? Let me highlight three special characteristics of high-tech equipment. First, computers and communications equipment depreciate at a very rapid pace. The current best estimate is that computers probably depreciate about 30 percent annually, although that estimate might be low, while other equipment probably depreciates at a rate of less than 15 percent annually. Therefore, computers are retired, on average, after three years, and the useful life for other equipment is about seven years. Firms must invest in computers at a faster rate than that for other forms of capital just to maintain a given level of the capital stock. The rapid replacement of high-tech capital means that technological progress becomes "embodied" in the capital stock at a faster rate than is the case for longer-lived assets. The second feature of high-tech equipment that sets it apart from other classes of capital is the sensitivity of its demand to fluctuations in the cost of capital. Economists have debated for decades about the magnitude of cost-of-capital effects on traditional capital goods. A past consensus was that there probably was a cost-of-capital effect but that it was small and very difficult to identify empirically. A somewhat different conclusion has arisen lately when the same basic models of investment are applied to spending on computers alone. The latest research shows that demand for computers is quite sensitive to movements in the cost of capital, and as a result of the 20 percent per annum decline in relative computer prices, the cost of this type of capital fell rapidly in the past decade. This combination of a high price elasticity and a rapidly declining price led to the boom in high-tech investment.
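The service-life arithmetic behind these figures is simple: under geometric depreciation at rate delta, the average service life is roughly 1/delta, and merely holding the capital stock constant requires replacement investment of delta times the stock. The following is a minimal illustrative sketch in Python; the functions are hypothetical and use only the depreciation rates quoted above, not anything drawn from the Oliner-Sichel study:

# Illustrative only: under geometric depreciation at rate delta, capital
# evolves as K[t+1] = (1 - delta) * K[t] + I[t]. Two implications of a high
# delta: average service life is roughly 1/delta, and keeping K constant
# requires investment of delta * K each year.

def mean_service_life(delta):
    """Approximate average service life, in years, of capital retired at rate delta."""
    return 1.0 / delta

def replacement_investment(capital, delta):
    """Annual investment needed merely to keep the capital stock unchanged."""
    return delta * capital

for name, delta in [("computers", 0.30), ("other equipment", 0.15)]:
    print(f"{name}: service life of roughly {mean_service_life(delta):.1f} years; "
          f"maintaining $100 of capital takes about ${replacement_investment(100.0, delta):.0f} per year")

With delta of 0.30 the service life works out to a bit over three years and replacement investment to 30 percent of the stock each year, which is why, as noted above, high-tech spending must run fast just to keep the capital stock level.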
Although difficult to measure, such network effects certainly have stimulated the demand for high-tech equipment and have helped to speed up the dispersion of new technologies.

Supporting Structural Changes

The technological changes inspired by investments in computers have enhanced the ability of businesses to reduce their operating expenses. In many industries, investments in information technologies have helped firms to cut back on the volume of inventories that they hold as a precaution against glitches in their supply chain or as a hedge against unexpected increases in aggregate demand. Product development costs have probably also been reduced through the use of better computer hardware and software, and new communications technologies have increased the speed with which firms can share information--both internally and with their customers and suppliers. This is the intersection of macroeconomics and management. Many business observers now believe that these newer technologies are not only reducing the cost of transforming inputs into outputs but also decreasing "interaction costs," the costs incurred in getting different people and companies to work together to exchange goods and services. Obviously, the line between "transformation" and "interaction" is not clear, but consultants who have studied this topic believe that these interaction costs account for 55 percent of all labor costs, with some industries, such as financial services, estimated to have interaction costs as high as 70 percent of labor costs. I cannot verify these numbers, but the general concept seems useful. Largely as a result of the increase in productivity in the recent past, we have experienced a remarkable stability in unit labor costs. During the past five and a half years, unit labor costs for nonfinancial corporations, which are the most accurately measured, increased at an average annual rate of 0.2 percent. This compares quite favorably with the experience in the preceding ten years of a 2.2 percent annual rate of increase. If in fact "interactions" account for 55 percent of labor costs, this relatively flat trend in unit labor cost increases is consistent with the concept that the newer technologies are allowing easier, less labor-intensive, interactions. Importantly, given the high rate of depreciation and the steep declines in costs of high-tech equipment, these savings in unit labor costs are not being undermined by offsetting increases in unit nonlabor costs. Moreover, given intense competition and the resultant lack of pricing "leverage," ongoing programs to reduce costs have become a key part of corporate strategies to maintain or improve profit margins. The focus on cost reduction has worked to head off the development of inflationary pressures in this expansion.

The Future Path for Productivity Improvements

But technological waves ebb and flow, and it is natural to ask whether we can count on such rapid productivity growth in the future. On this score, I am cautiously optimistic, but I recognize both that forecasting technology is extremely difficult and that there will be occasional bumps in the road. Let me explain my reasons for caution and optimism. The risk that productivity growth might moderate centers on the high-tech sector, computers and communications equipment, and the associated relative price declines. Historical patterns suggest that such narrowly based productivity increases might not continue, and therefore caution is in order. However, there are two reasons to be optimistic.
First, the recent burst in productivity growth seems to be more a product of changing technology than of transient business cycle influences, and as a result, there is less chance of a "payback period" of particularly sluggish productivity growth. Second, computer industry experts, including those in the semiconductor industry, do not indicate that the industry has exhausted its potential to produce faster and cheaper computers. Similarly, business leaders suggest that they are still taking advantage of the advances in computing power at lower costs to find new and productive uses of newer technologies.

The Macroeconomic Implications of Faster Productivity Growth

Theory teaches us that the step-up in the growth rate of technological change certainly has important implications for economic activity and inflation. The main reason policymakers and economists are interested in the growth rate of productivity is that it helps us to understand the economy’s potential to supply goods and services. The effects on the economy’s ability to produce goods and services are clear, but theory predicts that a new, higher rate of productivity growth would also affect the demand for goods and services. The most immediate effects would be on capital investment, as we have seen. A more rapid pace of technological change raises the real rate of return on new investments--perhaps significantly. Put another way, a more rapid pace of technological change makes investments in capital goods embodying the new technology more profitable. When businesses recognize the new technological possibilities, capital spending accelerates to take advantage of the new profit opportunities. The employment and income generated by business spending on capital goods boosts consumer spending and sets off another round of investment spending. Typically referred to by economists as "multiplier-accelerator" effects, such processes would continue as long as the real rate of return on a new capital project exceeded the real cost of capital for that project. Through this process, an innovation on the supply side of the economy generates a comparable increase in aggregate demand. It is important to emphasize that higher productivity growth translates into higher real income growth for employees. This added income is seen most clearly in the higher wages paid to that growing number of workers whose cash compensation is tied to company performance. In addition, for those workers who have been granted stock options, higher profits today and the potential for further increases tomorrow translate into higher stock prices for their company and ultimately an increase in their overall compensation. But real incomes should increase even for workers whose compensation is not directly linked to company performance, as profitable business opportunities bolster the demand for scarce labor. Theory also teaches that the increase in the rate of return on capital--even if generated by a rise in the growth rate of technical change--ultimately requires an increase in real market interest rates. Market interest rates must rise in order to maintain equilibrium between the higher demand for investment funds and the supply of investment funds. And, indeed, we have seen that market interest rates, particularly for corporate issuers, have risen steadily for the last year or so.
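The multiplier-accelerator process described above can be made concrete with a textbook simulation in the spirit of Samuelson's classic model; the parameter values below are illustrative assumptions, and this is a stylized sketch rather than the Board's model of the economy.

```python
# Toy multiplier-accelerator dynamics: consumption follows lagged income,
# and investment responds to the change in consumption. A permanent rise
# in autonomous demand (say, from new technology) sets off several rounds
# of induced spending before income settles at its new steady state.

c, v = 0.6, 0.8              # assumed propensity to consume and accelerator
g_old, g_new = 40.0, 44.0    # autonomous demand before and after a step-up
Y = [100.0, 100.0]           # start at the old steady state, g_old / (1 - c)

for t in range(2, 16):
    consumption = c * Y[-1]
    investment = v * (consumption - c * Y[-2])   # accelerator term
    Y.append(g_new + consumption + investment)

print(f"new steady state: {g_new / (1 - c):.1f}")
for t, y in enumerate(Y):
    print(f"t={t:2d}  income={y:6.2f}")
```

Under these assumed parameters, income overshoots slightly and then converges, with damped cycles, to its new, higher steady state--one way to see how a supply-side innovation can generate a comparable rise in aggregate demand.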
Interestingly, rates of return for forms of capital other than computer equipment, including both structures and non-computer equipment, either have not increased or have not risen as much as the rates of return on their high-tech counterparts. It is therefore possible that, during this period, investment flows have been reallocated away from firms producing traditional capital goods and toward firms and industries that make high-tech goods and services. This somewhat abstract description of the effects of a step-up in the growth rate of technical change bears a striking resemblance to the developments of recent years in labor markets, prices of goods and services, capital investments, and fixed-income markets. But there’s still an element missing: the stock market. A higher rate of technical change that raises the productivity and hence the profitability of capital should elevate the value of equities. Since equity prices reflect market expectations of future cash flow and dividends, any adjustment in profit expectations can and does lead to a resetting of equity prices. Are stocks today overvalued, correctly valued, or undervalued? I certainly do not know, and I am not aware of anyone who does. As a result, I believe that it would be unwise--and indeed impossible--for the Federal Reserve to target specific levels of valuations in equity markets. However, equity markets obviously do have spillover effects on the real economy and, thus, need to be considered in assessing the aggregate balance of supply and demand. Given the efficiency and forward-looking nature of financial markets, even expected future technical innovations will have an immediate effect on equity valuations. Equity values, in turn, can influence consumer behavior. As you know, economists often speak of the "wealth effect," and econometric modeling indicates that consumers eventually tend to raise the level of their spending by 2 to 5 cents for every incremental dollar of wealth. As a consequence, equity valuations can have a noticeable effect on consumption and on macroeconomic performance. To put a rough number on these influences, simulations by the Board staff using our econometric model of the economy suggest that wealth generated in the equity markets over the last four years added about 1 percentage point to the growth rate of real GDP. Of late, equity markets have given up some of their gains. However, economists who have studied the topic generally think that the impact of a change in stock-market-generated wealth on consumption begins to build in the first year and may take two to three years to be fully felt. Additionally, equity markets are a source of investment capital, and valuations in the stock market are one determinant of the cost of capital for businesses. External financing conditions, including equity valuations, are important because recent investments have increasingly been financed from external sources. External funds raised now account for about 20 percent of nominal capital expenditures--close to the highs of the past two decades.

Monetary Policy and the "New" Economy

As I have said many times before, uncertainty about productivity trends poses a major challenge in the design and implementation of monetary policy. As you can imagine, it is very difficult to infer the true structure of the economy through the interpretation of the twists and turns of incoming economic data.
How do we know, for example, if unexpected developments are just temporary movements away from stable longer-run relationships or are manifestations of changes in the underlying economic structure? In many cases, this judgment is difficult to make with much confidence even considerably after the fact. In the meantime, we must bear in mind that the statistical relationships we work with, embodied in our econometric models, are only loose approximations of the underlying reality. The considerable uncertainty regarding statistical constructs such as the "natural" rate of unemployment or the "sustainable" rate of growth of the economy suggests, in my judgment, the need to downplay forecasts of inflation based solely on those variables. Some fog always obstructs our vision, but when the structure of the economy is changing, the fog is considerably denser than at other times. What should be done when such uncertainties seem particularly acute? When we suspect that our understanding of the macroeconomic environment has deteriorated, as evidenced by strings of surprises difficult to reconcile with our earlier beliefs, I think that the appropriate response is to rely less upon the future predicted by the increasingly unreliable old models and more upon inferences from the more recent past. That means we should weight incoming data more heavily than data from decades past in trying to make judgments about the new economy and, of course, act appropriately when the evidence becomes clear. It also is important to be aware of the potential for unanticipated developments to emerge that might have implications for policymaking. The rate of growth of our economy has stepped down from the unsustainable pace of earlier this year. During such a period, potential risks emerge more clearly. First, we must be mindful that an unexpected slowdown might occur in the growth of productivity. As I said, I am cautiously optimistic that the rapid pace of productivity growth can be extended. However, we now know that an unexpected and unrecognized slowdown in productivity growth occurred in 1973. The causes are still debated, but we know that the slowdown contributed to "stagflation," which emerged as employees demanded increased compensation, based on unrealistically high expectations of productivity growth and gradually rising inflation expectations, and employers granted those increases. To maintain profit margins, businesses then passed those cost increases through to prices. This pass-through occurred as the rate of growth in the economy subsided. This is the reverse of the good news that we have experienced in this expansion. The second risk to good performance is that the investment boom, at least in some sectors, may overshoot. We are not only in the longest expansion in the history of our nation but also in the longest investment boom. Expectations of future returns from capital may not materialize, and companies may find that they have over-invested in capital stock. Other investment booms have ended with a pullback in investment that has slowed growth sharply, and we should be mindful that such an outcome is not impossible. Indeed, the recent re-leveling of the stock prices of some high-tech companies may suggest that we are entering a period of reduced optimism about future profits and less rapid growth in business investment. The third risk is that the capital inflows from abroad that have been funding our domestic investments may dry up. The elevated stock market has reduced household savings.
Net government saving has increased greatly, in the form of the surplus at the federal, state, and local levels, but as a nation we also rely on capital inflows from overseas. Capital inflows, as you know, are the counterpart of our record current account deficit. The gap between domestic savings and investment is large and growing, and if the inflow of foreign capital reversed suddenly, the consequences for our economy would be noticeable. A fourth risk arises from ongoing adjustments in financial markets to the perception of a riskier economic environment. Over the course of this year, commercial banks have tightened their lending standards, and quality spreads have increased in the bond market--especially in the high-yield sector. Activity in the IPO market has subsided as equity investors have turned away from riskier ventures. Taking into account also the decline in equity prices since the spring and the rise in the foreign exchange value of the dollar, financial markets are imposing more restraint on the economy than they have in recent years. A reassessment of risks is a natural and desirable byproduct of financial market adjustments, and of returning to more sustainable economic conditions. There is always a danger, however, that participants will overreact in such a period of adjustment. I do not today see such an overreaction, but we have to be aware that markets have turned excessively pessimistic in the past, with negative effects on economic activity. Similarly, although investment growth appears to have slowed, it is still rapid by historical standards and the dollar remains firm. Thus, I do not see, at this stage, evidence of a marked drop off in investment or of a sudden reversal in capital flows. However, it is prudent to be mindful of these risks in this transition period.

Conclusion

In conclusion, let me remind you that, while these are challenging times for monetary policymakers and financial market participants, the U.S. economy is enjoying a period of unprecedented prosperity. Our job at the Federal Reserve is to do our utmost to produce a stable economic environment of maximum sustainable growth without inflation so that these trends can continue. To produce such an environment we must be equally vigilant against the risk of either an extended period of growth unacceptably below potential, or a resurgence of inflation.
| board of governors of the federal reserve system | 2000 | 12 |
Remarks by Mr Roger W Ferguson, Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, at the Bay Area Council 2001 Outlook Conference, held in Oakland, California on 12 January 2001
Roger W Ferguson, Jr: US Domestic Macroeconomic Developments - Past, Present, and Future

Remarks by Mr Roger W Ferguson, Jr, Vice-Chairman of the Board of Governors of the US Federal Reserve System, at the Bay Area Council 2001 Outlook Conference, held in Oakland, California on 12 January 2001.

* * *

Thank you for inviting me to the Bay Area Council 2001 Outlook Conference. As always, the views I will be expressing are my own and are not necessarily shared by other members of the Board of Governors or of the Federal Open Market Committee. Before I discuss the current economic environment and recent monetary policy actions, let me set the stage with a brief review of the extraordinary performance of the U.S. economy in the second half of the 1990s. From 1995 to 1999, real gross domestic product grew, on average, about 4 percent per year. This pace was significantly above that in the previous five years, and you would have to go back to the 1960s to find even closely comparable periods of consistently robust economic expansion. In this environment, the unemployment rate fell to 4 percent, and the underlying rate of price inflation slowed, on net, despite very high rates of resource utilization. Even the most optimistic of forecasters could not have anticipated such a favorable confluence of economic events.

Productivity Growth and Cost Reductions

So, how did this confluence of positive events come about? As a policymaker, I’d like to think that well-executed monetary and fiscal policies--each focused importantly on their respective long-run goals of achieving price stability and reining in deficit spending--played some role in creating economic conditions that fostered noninflationary economic growth. Our economy has also benefited from past actions by the government to deregulate industries. The removal of unnecessary government regulation started more than twenty years ago, during the Administration of President Ford, and gathered momentum during the Carter years. It has altered the business landscape by allowing, indeed forcing, businesses to focus more clearly on a more competitive marketplace with fewer constraints and increased flexibility. But the dominant force of late appears to have been a significant increase in the rate of productivity growth: Output per hour in the nonfarm business sector--a conventional measure of productivity--increased at an annual rate of almost 3 percent between 1995 and 1999, well above the pace earlier in the decade. Cyclical forces such as the inability of businesses to add to their payrolls as rapidly as they would have liked in response to the rise in demand probably played some role in these productivity gains. But probably more important, I suspect, were longer-term, structural changes arising from the boom in capital spending and the revolution in information technology. Let me turn to the evidence on this point.

Technology Change and Productivity Growth

Bob Solow--the MIT economist who won the Nobel Prize in economics for his work on the theory of economic growth--once quipped that you can see computers everywhere except in the productivity statistics. A few years ago that situation began to change, and we now have strong evidence that the faster productivity growth our economy has experienced is, in fact, due partly to newer technologies. Research by Federal Reserve Board economists Steve Oliner and Dan Sichel sheds some light on the sources of this faster productivity growth.
The high (and rising) levels of business investment raised the amount of capital per worker and thereby boosted productivity. About 1/2 percentage point of the increase in productivity growth over the 1995-99 period can be attributed to this so-called “capital deepening,” most of which reflected greater spending by businesses on computers, software, and communications equipment. Another 1/2 percentage point of the pickup in productivity growth reflected technological innovations in the actual production of computer hardware and semiconductors as well as better management--perhaps assisted by these high-tech investments--of the nation’s capital and labor resources. Oliner and Sichel estimate that the consolidated influences of high-tech investments account for about two-thirds of the acceleration in productivity since 1995. This research supports the view that fundamental changes are under way in our economy.

What’s So Special about this Capital?

That trend productivity has picked up and that high-tech investments are the source of the acceleration are important facts, but by now they are not new observations. Perhaps at this stage it would be useful to explore in greater detail this positive “shock” to the ability of our economy to produce goods and services. What is so special about computers and other information technologies that they can have such a strong influence on our economy? Let me highlight three special characteristics of high-tech equipment. First, computers and communications equipment depreciate at a very rapid pace. The current best estimate is that computers probably depreciate about 30 percent annually, although that estimate might be low, while other equipment probably depreciates at a rate of less than 15 percent annually. Therefore, computers are retired, on average, after three years, and the useful life for other equipment is about seven years. Firms must invest in computers at a faster rate than that for other forms of capital just to maintain a given level of the capital stock. The rapid replacement of high-tech capital means that technological progress becomes “embodied” in the capital stock at a faster rate than is the case for longer-lived assets. The second feature of high-tech equipment that sets it apart from other classes of capital is the sensitivity of its demand to fluctuations in the cost of capital. Economists have debated for decades about the magnitude of cost-of-capital effects on traditional capital goods. A past consensus was that the cost-of-capital effect was small and very difficult to identify empirically. A somewhat different conclusion has arisen lately when the same basic models of investment are applied to spending on computers alone. The latest research suggests that computers are quite sensitive to movements in the cost of capital. The combination of an apparently high price elasticity and a rapid decline in relative computer prices--20 percent per year over the past decade--likely led to the boom in high-tech investment. A third characteristic of high-tech investment is the magnitude of “external” or “spillover” effects that it generates. High-tech equipment generates benefits not only to the owner of the machine but to other agents in the economy as well. I am thinking in particular about so-called network effects--that is, linking computers together makes possible larger productivity gains than do computers operated as stand-alone units.
Although difficult to measure, such network effects certainly have stimulated the demand for high-tech equipment and have helped to speed up the dispersion of new technologies.

Supporting Structural Changes

The technological changes inspired by investments in computers have enhanced the ability of businesses to reduce their operating expenses. In many industries, investments in information technologies have helped firms to cut back on the volume of inventories that they hold as a precaution against glitches in their supply chain or as a hedge against unexpected increases in aggregate demand. Product development costs have probably also been reduced through the use of better computer hardware and software, and new communications technologies have increased the speed with which firms can share information--both internally and with their customers and suppliers. This is the intersection of macroeconomics and management. Many business observers now believe that these newer technologies are not only reducing the cost of transforming inputs into outputs but also decreasing “interaction costs,” the costs incurred in getting different people and companies to work together to exchange goods and services. Obviously, the line between “transformation” and “interaction” is not clear, but consultants who have studied this topic believe that these interaction costs account for 55 percent of all labor costs, with some industries, such as financial services, estimated to have interaction costs as high as 70 percent of labor costs. I cannot verify these numbers, but the general concept seems useful. Largely as a result of the increase in productivity in the recent past, we have experienced a remarkable stability in unit labor costs. During the past five and a half years, unit labor costs for nonfinancial corporations, which are the most accurately measured, increased at an average annual rate of 0.2 percent. This compares quite favorably with the experience in the preceding ten years of a 2.2 percent annual rate of increase. If in fact “interactions” account for 55 percent of labor costs, this relatively flat trend in unit labor costs is consistent with the concept that the newer technologies are allowing easier, less labor-intensive, interactions. Importantly, given the high rate of depreciation and the steep declines in costs of high-tech equipment, these savings in unit labor costs are not being undermined by offsetting increases in unit nonlabor costs. Moreover, given intense competition and the resultant lack of pricing “leverage,” ongoing programs to reduce costs have become a key part of corporate strategies to maintain or improve profit margins. The focus on cost reduction has worked to head off the development of inflationary pressures in this expansion.

The Macroeconomic Implications of Faster Productivity Growth

Theory teaches us that the step-up in the growth rate of technological change certainly has important implications for economic activity and inflation. The main reason policymakers and economists are interested in the growth rate of productivity is that it helps us to understand the economy’s potential to supply goods and services. The effects on the economy’s ability to produce goods and services are clear, but theory predicts that a higher rate of productivity growth would also affect the demand for goods and services. The most immediate effects would be on capital investment, as we have seen.
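One way to see why falling computer prices matter so much for that investment response is a back-of-the-envelope user-cost calculation in the spirit of the standard Hall-Jorgenson formula; the real rate below is an assumption, and the depreciation and price-decline figures are the illustrative numbers cited earlier in this talk, not Board estimates.

```python
# Stylized user cost of capital: roughly price * (real rate + depreciation
# + expected relative price decline). Falling computer prices impose a
# capital loss on each machine, but the purchase price per unit of
# computing power falls so fast that the user cost still plunges.

REAL_RATE = 0.05  # assumed real financing rate

def user_cost(price, depreciation, price_decline):
    return price * (REAL_RATE + depreciation + price_decline)

price = 1.0
for year in range(6):
    cost = user_cost(price, depreciation=0.30, price_decline=0.20)
    print(f"year {year}: computer user cost = {cost:.3f}")
    price *= 0.80  # the 20 percent annual relative price decline

print(f"other equipment: {user_cost(1.0, depreciation=0.15, price_decline=0.0):.3f}")
```

With a high price elasticity of demand, a user cost that falls by roughly half every three years goes a long way toward explaining the boom in high-tech investment.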
Businesses recognize the new technological possibilities, and capital spending accelerates to take advantage of the new profit opportunities. The employment and income generated by business spending on capital goods boosts consumer outlays and sets off another round of investment spending. Through this process, an innovation on the supply side of the economy generates a comparable increase in aggregate demand. It is important to emphasize that higher productivity growth translates into higher real income growth for employees. This added income is seen most clearly in the higher wages paid to that growing number of workers whose cash compensation is tied to company performance either directly or through stock options. But real incomes should increase even for workers whose compensation is not directly linked to company performance, as profitable business opportunities bolster the demand for scarce labor. Theory also teaches that the increase in the rate of return on capital--even if generated by a rise in the growth rate of technical change--ultimately requires an increase in real market interest rates. All else equal, market interest rates must rise in order to maintain equilibrium between the higher demand for investment funds and the supply of investment funds. This somewhat abstract description of the effects of a step-up in the growth rate of technical change bears a striking resemblance to the developments of recent years in labor markets, prices of goods and services, capital investments, and fixed-income markets. But there’s still an element missing: the stock market. A higher rate of technical change that raises the productivity and hence the profitability of capital should elevate the value of equities. Since equity prices reflect market expectations of future cash flow and dividends, any adjustment in profit expectations can and does lead to a resetting of equity prices. Are stocks today overvalued, correctly valued, or undervalued? I certainly do not know, and I am not aware of anyone who does. As a result, I believe that it would be unwise--and indeed impossible--for the Federal Reserve to target specific levels of valuations in equity markets. However, equity markets obviously do have spillover effects on the real economy and, thus, need to be considered in assessing the aggregate balance of supply and demand. Given the efficiency and forward-looking nature of financial markets, even expected future technical innovations will have an immediate effect on equity valuations. Equity values, in turn, can influence consumer behavior. As you know, economists often speak of the "wealth effect," and econometric modeling indicates that consumers eventually tend to raise the level of their spending by 2 to 5 cents for every incremental dollar of wealth. As a consequence, equity valuations can have a noticeable effect on consumption and on macroeconomic performance. To put a rough number on these influences, simulations by the Board staff using our econometric model of the economy suggest that wealth generated in the equity markets over the past four years added about 1 percentage point to the growth rate of real GDP. Of late, equity markets have given up some of their gains. However, economists who have studied the topic generally think that the consumption effect of a change in wealth generated by the stock market begins to build in the first year and may take two to three years to be fully felt. 
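The rough arithmetic of that wealth effect is easy to reproduce; in this sketch the wealth gain and the level of GDP are illustrative assumptions chosen only to show orders of magnitude, not Board figures.

```python
# Back-of-the-envelope wealth effect: consumers are assumed to spend an
# extra 2 to 5 cents, eventually, per incremental dollar of equity wealth.

wealth_gain = 3.0e12   # assumed cumulative equity wealth gain, dollars
nominal_gdp = 1.0e13   # rough level of nominal GDP, dollars

for mpc in (0.02, 0.05):
    extra_spending = mpc * wealth_gain
    share = 100 * extra_spending / nominal_gdp
    print(f"MPC {mpc:.2f}: ${extra_spending / 1e9:.0f} billion of added "
          f"consumption, about {share:.1f} percent of GDP")
```

Spread over the two to three years it takes the effect to build, numbers of this size are broadly consistent with the staff estimate that equity-market wealth added about 1 percentage point to real GDP growth.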
Additionally, equity markets are a source of investment capital, and valuations in the stock market are one determinant of the cost of capital for businesses. External financing conditions, including equity valuations, are important because recent investments have increasingly been financed from external sources. External funds raised now account for about 20 percent of nominal capital expenditures--close to the highs of the past two decades.

The Year 2000: A Period of Transition

Productivity-led growth continued through the first half of 2000, with real GDP racing ahead at a breakneck 5-1/4 percent annual rate. The growth of aggregate demand was clearly outpacing even the most optimistic projections for the growth of potential supply, and the dangers of an increase in inflation pressures were higher. In order to promote better balance between aggregate demand and potential supply and to contain inflationary pressures, the Federal Open Market Committee took additional firming actions, raising the benchmark federal funds rate 1 percentage point between February and May. Even with these increases, the Committee felt that the balance of risks to the economic outlook still was tipped toward greater inflation. Signs of adjustment to a more sustainable level of growth of aggregate demand began to emerge over the course of the summer. But with no evidence either of a significant deterioration in price trends or a serious softening in economic activity, the FOMC entered a period of watchful waiting. As you know, monetary policy works with long and variable lags, and the Committee believed it appropriate to monitor how its earlier actions were affecting the economy before taking additional steps. As the second half of the year wore on, the data confirmed that a slowdown was under way. But for a while its extent and likely duration were still open questions. By the time of the November meeting of the FOMC, the statistical evidence and developments in financial markets suggested that the growth of aggregate demand had slowed to a more sustainable pace that might even be a bit below the growth rate of the economy’s potential to produce. But with the labor markets quite tight and energy prices rising, the Committee still saw a continuing risk of heightened inflation. Economic conditions changed dramatically late in the year, though. By the December meeting, the “hard” statistical data on economic expansion, which are available with a lag, showed some signs of weakness, but aggregate economic activity still appeared to be expanding moderately in the fourth quarter. In contrast, other, anecdotal indicators turned decidedly more bearish. For example, warnings of slower sales and earnings growth resulted in substantial markdowns in the valuations of many leading high-tech companies, and consumer confidence swooned in December. Were these the signs of the inevitable “bumps in the road” that occur during a period of transition from rapid growth to a more sustainable pace of expansion, or were they the early warning signs of a more serious slackening in demand? The FOMC wrestled with this issue at its December meeting. The risks clearly had shifted toward a period of subpar growth, but as of the middle of December, the evidence in hand did not appear sufficient to justify a rate cut. Following the December meeting of the FOMC, Committee members remained on “high alert” for signs of additional slowing in the pace of economic activity. As 2001 began, evidence accumulated that sales and production were weakening.
The National Association of Purchasing Managers report suggested widespread softness among manufacturers. Auto sales and production were taking a hit. Revisions to the data on orders and shipments of nondefense capital goods provided hard evidence of a slowdown in business spending on high-tech capital goods, and more companies announced disappointing sales and earnings. Initial claims for unemployment insurance moved up further at the end of December, and layoff announcements increased. The reports of consumer spending at year-end were disappointing. In light of these factors, the FOMC acted on January 3 to cut the federal funds rate 50 basis points. Taking a policy action between meetings surprised some observers, and some have asked, “What does the Fed know that we don't?” The answer to this question now, as in nearly every situation in which we change policy, is--"very little, if anything." Although I generally favor making policy decisions at our regularly scheduled meetings, we must remain flexible and be prepared to respond quickly and firmly to developments that deviate significantly from our expectations.

2001 Outlook

Although some fog always surrounds the outlook, there is more than the usual amount of uncertainty at this juncture about the economic future. Private sector forecasters have been marking down their forecasts in response to incoming data, and most now fall somewhere between 2 and 3 percent for real GDP growth this year. Clearly, demand has weakened faster than most businesses anticipated, and inventories have become uncomfortably high in some sectors. But businesses are moving fast to adjust levels of production. And final demand should be supported by the lower interest rates put in place by the financial markets and the Federal Reserve. Consequently, I would expect a period of notable weakness early in 2001, followed by a pickup in activity. Despite the sharp correction in “new economy” stock prices, I believe that the underlying technical advances I discussed earlier in this talk will continue and will provide considerable support over time for business and consumer spending. In its most recent announcement, the FOMC indicated that risks remain weighted toward economic weakness. While the scenario I outlined seems the most likely to occur, I am not sure what level of interest rates will be associated with it. This depends on a number of developments, including the evolution of financial markets and investor appetite for risk.

Conclusion

Obviously the stance of monetary policy will play an important role in shaping the eventual outcome. In recent weeks the Committee’s outlook changed rapidly as a result of incoming data and anecdotal reports, demonstrating the importance of our constant and intense scrutiny of the economy. The Federal Reserve will continue to analyze the incoming information carefully and will act prudently and forcefully to provide the monetary and financial conditions that will foster price stability and promote sustainable growth in output.
| board of governors of the federal reserve system | 2001 | 1 |
Testimony of Mr Alan Greenspan, Chairman of the Federal Reserve Board on the occasion of the Federal Reserve Board's semi-annual monetary policy report to the Committee on Banking, Housing, and Urban Affairs, U.S. Senate on 13 February 2001
Alan Greenspan: Semi-annual monetary policy report to the US Congress

Testimony of Mr Alan Greenspan, Chairman of the Federal Reserve Board on the occasion of the Federal Reserve Board’s semi-annual monetary policy report to the Committee on Banking, Housing, and Urban Affairs, U.S. Senate on 13 February 2001.

* * *

I appreciate the opportunity this morning to present the Federal Reserve's semi-annual report on monetary policy. The past decade has been extraordinary for the American economy and monetary policy. The synergies of key technologies markedly elevated prospective rates of return on high-tech investments, led to a surge in business capital spending, and significantly increased the underlying growth rate of productivity. The capitalization of those higher expected returns boosted equity prices, contributing to a substantial pickup in household spending on new homes, durable goods, and other types of consumption generally, beyond even that implied by the enhanced rise in real incomes. When I last reported to you in July, economic growth was just exhibiting initial signs of slowing from what had been an exceptionally rapid and unsustainable rate of increase that began a year earlier. The surge in spending had lifted the growth of the stocks of many types of consumer durable goods and business capital equipment to rates that could not be continued. The elevated level of light vehicle sales, for example, implied a rate of increase in the number of vehicles on the road hardly sustainable for a mature industry. And even though demand for a number of high-tech products was doubling or tripling annually, in many cases new supply was coming on even faster. Overall, capacity in high-tech manufacturing industries rose nearly 50 percent last year, well in excess of its rapid rate of increase over the previous three years. Hence, a temporary glut in these industries and falling prospective rates of return were inevitable at some point. Clearly, some slowing in the pace of spending was necessary and expected if the economy was to progress along a balanced and sustainable growth path. But the adjustment has occurred much faster than most businesses anticipated, with the process likely intensified by the rise in the cost of energy that has drained business and household purchasing power. Purchases of durable goods and investment in capital equipment declined in the fourth quarter. Because the extent of the slowdown was not anticipated by businesses, it induced some backup in inventories, despite the more advanced just-in-time technologies that have in recent years enabled firms to adjust production levels more rapidly to changes in demand. Inventory-sales ratios rose only moderately; but relative to the levels of these ratios implied by their downtrend over the past decade, the emerging imbalances appeared considerably larger. Reflecting these growing imbalances, manufacturing purchasing managers reported last month that inventories in the hands of their customers had risen to excessively high levels. As a result, a round of inventory rebalancing appears to be in progress. Accordingly, the slowdown in the economy that began in the middle of 2000 intensified, perhaps even to the point of growth stalling out around the turn of the year. As the economy slowed, equity prices fell, especially in the high-tech sector, where previous high valuations and optimistic forecasts were being reevaluated, resulting in significant losses for some investors. In addition, lenders turned more cautious.
This tightening of financial conditions, itself, contributed to restraint on spending. Against this background, the Federal Open Market Committee (FOMC) undertook a series of aggressive monetary policy steps. At its December meeting, the FOMC shifted its announced assessment of the balance of risks to express concern about economic weakness, which encouraged declines in market interest rates. Then on January 3, and again on January 31, the FOMC reduced its targeted federal funds rate 1/2 percentage point, to its current level of 5-1/2 percent. An essential precondition for this type of response was that underlying cost and price pressures remained subdued, so that our front-loaded actions were unlikely to jeopardize the stable, low inflation environment necessary to foster investment and advances in productivity. The exceptional weakness so evident in a number of economic indicators toward the end of last year (perhaps in part the consequence of adverse weather) apparently did not continue in January. But with signs of softness still patently in evidence at the time of its January meeting, the FOMC retained its sense that the risks are weighted toward conditions that may generate economic weakness in the foreseeable future. Crucial to the assessment of the outlook and the understanding of recent policy actions is the role of technological change and productivity in shaping near-term cyclical forces as well as long-term sustainable growth. The prospects for sustaining strong advances in productivity in the years ahead remain favorable. As one would expect, productivity growth has slowed along with the economy. But what is notable is that, during the second half of 2000, output per hour advanced at a pace sufficiently impressive to provide strong support for the view that the rate of growth of structural productivity remains well above its pace of a decade ago. Moreover, although recent short-term business profits have softened considerably, most corporate managers appear not to have altered to any appreciable extent their long-standing optimism about the future returns from using new technology. A recent survey of purchasing managers suggests that the wave of new on-line business-to-business activities is far from cresting. Corporate managers more generally, rightly or wrongly, appear to remain remarkably sanguine about the potential for innovations to continue to enhance productivity and profits. At least this is what is gleaned from the projections of equity analysts, who, one must presume, obtain most of their insights from corporate managers. According to one prominent survey, the three- to five-year average earnings projections of more than a thousand analysts, though exhibiting some signs of diminishing in recent months, have generally held firm at a very high level. Such expectations, should they persist, bode well for continued strength in capital accumulation and sustained elevated growth of structural productivity over the longer term. The same forces that have been boosting growth in structural productivity seem also to have accelerated the process of cyclical adjustment. Extraordinary improvements in business-to-business communication have held unit costs in check, in part by greatly speeding up the flow of information.
New technologies for supply-chain management and flexible manufacturing imply that businesses can perceive imbalances in inventories at a very early stage--virtually in real time--and can cut production promptly in response to the developing signs of unintended inventory building. Our most recent experience with some inventory backup, of course, suggests that surprises can still occur and that this process is still evolving. Nonetheless, compared with the past, much progress is evident. A couple of decades ago, inventory data would not have been available to most firms until weeks had elapsed, delaying a response and, hence, eventually requiring even deeper cuts in production. In addition, the foreshortening of lead times on delivery of capital equipment, a result of information and other newer technologies, has engendered a more rapid adjustment of capital goods production to shifts in demand that result from changes in firms' expectations of sales and profitability. A decade ago, extended backlogs on capital equipment meant a more stretched-out process of production adjustments. Even consumer spending decisions have become increasingly responsive to changes in the perceived profitability of firms through their effects on the value of households' holdings of equities. Stock market wealth has risen substantially relative to income in recent years--itself a reflection of the extraordinary surge of innovation. As a consequence, changes in stock market wealth have become a more important determinant of shifts in consumer spending relative to changes in current household income than was the case just five to seven years ago. The hastening of the adjustment to emerging imbalances is generally beneficial. It means that those imbalances are not allowed to build until they require very large corrections. But the faster adjustment process does raise some warning flags. Although the newer technologies have clearly allowed firms to make more informed decisions, business managers throughout the economy also are likely responding to much of the same enhanced body of information. As a consequence, firms appear to be acting in far closer alignment with one another than in decades past. The result is not only a faster adjustment, but one that is potentially more synchronized, compressing changes into an even shorter time frame. This very rapidity with which the current adjustment is proceeding raises another concern, of a different nature. While technology has quickened production adjustments, human nature remains unaltered. We respond to a heightened pace of change and its associated uncertainty in the same way we always have. We withdraw from action, postpone decisions, and generally hunker down until a renewed, more comprehensible basis for acting emerges. In its extreme manifestation, many economic decisionmakers not only become risk averse but attempt to disengage from all risk. This precludes taking any initiative, because risk is inherent in every action. In the fall of 1998, for example, the desire for liquidity became so intense that financial markets seized up. Indeed, investors even tended to shun risk-free, previously issued Treasury securities in favor of highly liquid, recently issued Treasury securities. But even when decisionmakers are only somewhat more risk averse, a process of retrenchment can occur. 
Thus, although prospective long-term returns on new high-tech investment may change little, increased uncertainty can induce a higher discount of those returns and, hence, a reduced willingness to commit liquid resources to illiquid fixed investments. Such a process presumably is now under way and arguably may take some time to run its course. It is not that underlying demand for Internet, networking, and communications services has become less keen. Instead, as I noted earlier, some suppliers seem to have reacted late to accelerating demand, have overcompensated in response, and then have been forced to retrench--a not-unusual occurrence in business decisionmaking. A pace of change outstripping the ability of people to adjust is just as evident among consumers as among business decisionmakers. When consumers become less secure in their jobs and finances, they retrench as well. It is difficult for economic policy to deal with the abruptness of a break in confidence. There may not be a seamless transition from high to moderate to low confidence on the part of businesses, investors, and consumers. Looking back at recent cyclical episodes, we see that the change in attitudes has often been sudden. In earlier testimony, I likened this process to water backing up against a dam that is finally breached. The torrent carries with it most remnants of certainty and euphoria that built up in earlier periods. This unpredictable rending of confidence is one reason that recessions are so difficult to forecast. They may not be just changes in degree from a period of economic expansion, but a different process engendered by fear. Our economic models have never been particularly successful in capturing a process driven in large part by nonrational behavior. Although consumer confidence has fallen, at least for now it remains at a level that in the past was consistent with economic growth. And as I pointed out earlier, expected earnings growth over the longer-run continues to be elevated. If the forces contributing to long-term productivity growth remain intact, the degree of retrenchment will presumably be limited. Prospects for high productivity growth should, with time, bolster both consumption and investment demand. Before long in this scenario, excess inventories would be run off to desired levels. Still, as the FOMC noted in its last announcement, for the period ahead, downside risks predominate. In addition to the possibility of a break in confidence, we don't know how far the adjustment of the stocks of consumer durables and business capital equipment has come. Also, foreign economies appear to be slowing, which could damp demands for exports; and, although some sectors of the financial markets have improved in recent weeks, continued lender nervousness still is in evidence in other sectors. Because the advanced supply-chain management and flexible manufacturing technologies may have quickened the pace of adjustment in production and incomes and correspondingly increased the stress on confidence, the Federal Reserve has seen the need to respond more aggressively than had been our wont in earlier decades. Economic policymaking could not, and should not, remain unaltered in the face of major changes in the speed of economic processes. Fortunately, the very advances in technology that have quickened economic adjustments have also enhanced our capacity for real-time surveillance. 
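The point made above, that increased uncertainty induces "a higher discount" of unchanged expected returns, can be illustrated with simple present-value arithmetic; the cash-flow stream, horizon, and discount rates in this sketch are illustrative assumptions only.

```python
# Present value of a long-lived, illiquid project under different discount
# rates: the expected cash flows are unchanged, but a higher risk premium
# (a "higher discount" of the same returns) cuts the project's value.

def present_value(cash_flow, rate, years):
    return sum(cash_flow / (1 + rate) ** t for t in range(1, years + 1))

CASH_FLOW, YEARS = 100.0, 15   # assumed annual return and project life

for premium in (0.00, 0.02, 0.05):
    rate = 0.06 + premium      # assumed base discount rate plus risk premium
    pv = present_value(CASH_FLOW, rate, YEARS)
    print(f"risk premium {premium:.0%}: present value = {pv:7.1f}")
```

Identical expected returns, discounted a few percentage points more heavily, look markedly less attractive, which is how greater uncertainty alone can restrain commitments to fixed investment.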
As I pointed out earlier, demand has been depressed by the rise in energy prices as well as by the needed slowing in the pace of accumulation of business capital and consumer durable assets. The sharp rise in energy costs pressed down on profit margins still further in the fourth quarter. About a quarter of the rise in total unit costs of nonfinancial, non-energy corporations reflected a rise in energy costs. The 12 percent rise in natural gas prices last quarter contributed directly, and indirectly through its effects on the cost of electrical power generation, about one-fourth of the rise in overall energy costs for nonfinancial, non-energy corporations; increases in oil prices accounted for the remainder. In addition, a significant part of the margin squeeze not directly attributable to higher energy costs probably has reflected the effects of the moderation in consumer outlays that, in turn, has been due in part to higher costs of energy, especially for natural gas. Hence, it is likely that energy cost increases contributed significantly more to the deteriorating profitability of nonfinancial, non-energy corporations in the fourth quarter than is suggested by the energy-related rise in total unit costs alone. To be sure, the higher energy expenses of households and most businesses represent a transfer of income to producers of energy. But the capital investment of domestic energy producers, and, very likely, consumption by their owners, have provided only a small offset to the constraining effects of higher energy costs on spending by most Americans. Moreover, a significant part of the extra expense is sent overseas to foreign energy producers, whose demand for exports from the United States is unlikely to rise enough to compensate for the reduction in domestic spending, especially in the short run. Thus, given the evident inability of energy users, constrained by intense competition for their own products, to pass on much of their cost increases, the effects of the rise in energy costs do not appear to have been broadly inflationary, in contrast to some previous episodes when inflation expectations were not as well anchored. Rather, the most prominent effects have been to depress aggregate demand. The recent decline in energy prices and further declines anticipated by futures markets, should they occur, would tend to boost purchasing power and be an important factor supporting a recovery in demand growth over coming quarters.

Economic projections

The members of the Board of Governors and the Reserve Bank presidents foresee an implicit strengthening of activity after the current rebalancing is over, although the central tendency of their individual forecasts for real GDP still shows a substantial slowdown, on balance, for the year as a whole. The central tendency for real GDP growth over the four quarters of this year is 2 to 2-1/2 percent. Because this average pace is below the rise in the economy's potential, they see the unemployment rate increasing to about 4-1/2 percent by the fourth quarter of this year. The central tendency of their forecasts for inflation, as measured by the prices for personal consumption expenditures, suggests an abatement to 1-3/4 to 2-1/4 percent over this year from 2-1/2 percent over 2000.

Government debt repayment and the implementation of monetary policy

Federal budget surpluses have bolstered national saving, providing additional resources for investment and, hence, contributing to the rise in the capital stock and our standards of living.
However, the prospective decline in Treasury debt outstanding implied by projected federal budget surpluses does pose a challenge to the implementation of monetary policy. The Federal Reserve has relied almost exclusively on increments to its outright holdings of Treasury securities as the "permanent" asset counterpart to the uptrend in currency in circulation, our primary liability. Because the market for Treasury securities is going to become much less deep and liquid if outstanding supplies shrink as projected, we will have to turn to acceptable substitutes. Last year the Federal Reserve System initiated a study of alternative approaches to managing our portfolio. At its late January meeting, the FOMC discussed this issue at length, and it is taking several steps to help better position the Federal Reserve to address the alternatives. First, as announced on January 31, the Committee extended the temporary authority, in effect since late August 1999, for the Trading Desk at the Federal Reserve Bank of New York to conduct repurchase agreements in mortgage-backed securities guaranteed by the agencies as well as in Treasuries and direct agency debt. Thus, for the time being, the Desk will continue to rely on the same types of temporary open market operations in use for the past year and a half to offset transitory factors affecting reserve availability. Second, the FOMC is examining the possibility of beginning to acquire under repurchase agreements some additional assets that the Federal Reserve Act already authorizes the Federal Reserve to purchase. In particular, the FOMC asked the staff to explore the possible mechanisms for backing our usual repurchase operations with the collateral of certain debt obligations of U.S. states and foreign governments. We will also be consulting with the Congress on these possible steps before the FOMC further considers such transactions. Taking such assets in repurchase operations would significantly expand and diversify the assets our counterparties could post in temporary open market operations, reducing the potential for any impact on the pricing of private sector instruments. Finally, the FOMC decided to study further the even longer-term issue of whether it will ultimately be necessary to expand the use of the discount window or to request the Congress for a broadening of its statutory authority for acquiring assets via open market operations. How quickly the FOMC will need to address these longer-run portfolio choices will depend on how quickly the supply of Treasury securities declines as well as the usefulness of the alternative assets already authorized by law. In summary, although a reduced availability of Treasury securities will require adjustments in the particular form of our open market operations, there is no reason to believe that we will be unable to implement policy as required.
Roger W. Ferguson, Jr: E-commerce: lessons learned to date Remarks by Mr Roger W. Ferguson, Jr., Vice Chairman of the Federal Reserve Board, at the Owen Graduate School of Management, Vanderbilt University, Nashville, Tennessee, on 14 February 2001 * * * I am pleased to be here with you today at Vanderbilt University. Your e-Lab and e-commerce programs make this a particularly appropriate place to talk about the lessons that can be drawn from developments in electronic commerce over the past several years. Governor Daane, thank you for inviting me to speak. The recent past has provided an excellent opportunity to observe how businesses and individuals respond to significant technological changes. But we should also be mindful that Internet-based commerce, in particular, is still young and unquestionably evolving. Hence, I must emphasize that these are only preliminary observations. Today I will touch on three main topics. First, I will explore a few early observations about the economics of electronic commerce based on the experiences of the past few years. Second, I will talk about payments, a key part of the country's economic infrastructure, and whether the current payment instruments can effectively support future electronic commerce. Finally, I will briefly review some activities that the Federal Reserve is pursuing to remove barriers to innovations in electronic payments and commerce. A few early lessons of e-commerce The terms "electronic commerce" and "e-commerce" generally refer to commercial activity involving the Internet, although they can also describe any commerce that relies primarily on electronic exchange of information. For many entrepreneurs, e-commerce is an entirely new market opportunity, while others see it as a versatile new channel offering opportunities to enhance existing markets. Whatever the business case for an application of new technology, successful applications generally have the potential to improve the lives of ultimate consumers by reducing transaction costs. Reduced transaction costs, in turn, can broaden the array of choices, expand the size of markets, and ultimately, through competition, improve the quality of existing goods and services. An initial observation is that, despite the novelty of the Internet applications we see today, electronics have been used for commercial purposes for well over a century. In the nineteenth and twentieth centuries, installation of telegraph wires and then telephone networks created a revolution in business communications not unlike the current e-commerce revolution, broadening markets by easing communications between distant trading partners and reducing risks associated with slow physical communication and transportation. In the extreme, telephone networks enabled two distant parties to communicate interactively and in real time. Of course, there are limits to the uses of voice communications. The advent of computers and new communications technology introduced the opportunity to transmit vast quantities of data, as well as voice, over existing telephone networks. The commercial prospects of combining new and existing communications technology with new information management technology, both software and hardware, spawned the investment boom that underpins, in part, the most recent revolution in e-commerce. But we must be careful to recognize that the rapid evolution of modern e-commerce does not repeal the laws of economics. 
In fact, we see now that economies with significant investments in information and communications technology remain subject to occasional capital goods overhangs, which may influence macroeconomic conditions. Over the last few months, data on orders and shipments of nondefense capital goods have provided hard evidence of a slowdown in business spending on high-tech capital goods. Our economy is clearly undergoing a stock adjustment to bring the supply of and demand for capital goods in some sectors into better alignment. Importantly, measures of growth of output per hour in the second half of 2000 were sufficiently strong to suggest that the growth rate of structural productivity remains robust. This in turn suggests that the rate of return on capital should be sufficiently attractive to foster new investment once this stock adjustment is complete. There are two key questions. First, when will the stock adjustment in high-tech capital run its course, and the supply of and demand for capital goods return to balance? Second, when balance is restored, what pace of investment in high-tech capital goods will ensue? Unfortunately, neither question is answerable with certainty at this stage. With respect to the duration of the stock adjustment, those who think that the process will be protracted point to both the length of the current investment boom and the historical experience with lengthy stock adjustments in capital goods to suggest that the period of retrenchment will be a long one. Those who are optimistic that this phase of rebalancing will be relatively short highlight two facts: the adjustment in capital goods ordering and production has been relatively rapid in this cycle, and modern high-tech capital goods are relatively short-lived--being depreciated in many cases in three years or less, as opposed to the seven years or more that characterize many types of traditional capital equipment. Which of these sets of factors predominates will determine, in part, the shape of the recovery from this period of slowing. Similarly, we cannot know with certainty the pace of investment in capital goods going forward. As I will discuss below, it is certain that the pace of future demand for capital goods will depend in part on the ability of providers of capital--banks, creditors in fixed income markets, and purchasers of equity--to recognize the risks inherent in high-tech capital investment plans and to price the risk appropriately. Let me now offer a few observations on cost, demand, and the microeconomics of e-commerce. The cost structure of electronic networks tends to be characterized by high fixed costs and very low marginal operating costs. This also appears to be true of the cost structure of a number of firms engaged in e-commerce. Initially, purchasing or developing software to support a competitive commercial enterprise on the Internet can be costly. But once software that meets a market demand is built, it can be paired with scalable hardware to handle significant additional volume for very little extra cost. It appears, however, that the basic cost structure of e-commerce has different applicability for different types of businesses in this sector. For example, the high fixed cost, low marginal cost model may fairly accurately characterize the cost structure of companies that provide on-line information services, such as information vendors, search engines, and electronic communications networks. 
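A stylized numerical illustration of that cost structure may help. The dollar figures below are hypothetical, chosen only to show how average cost collapses toward marginal cost as volume grows.

```python
# Stylized average-cost curve for a high fixed cost, low marginal cost
# business. The figures are hypothetical illustrations.

FIXED_COST = 40_000_000   # e.g., building and running the platform, per year
MARGINAL_COST = 0.05      # incremental cost of serving one more user

def average_cost(users: int) -> float:
    """Total cost per user: fixed costs spread over the user base
    plus the (small) marginal cost of each additional user."""
    return FIXED_COST / users + MARGINAL_COST

for n in (100_000, 1_000_000, 10_000_000):
    print(f"{n:>10,} users -> average cost ${average_cost(n):,.2f} per user")
# $400.05, $40.05, $4.05: scale drives average cost toward marginal
# cost, which is why volume matters so much for these firms.
```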
Those that produce their own information content clearly pay some production costs, but those costs appear to be small compared with the cost of building and operating a network. Interestingly, many information providers do not originate any content at all and rely instead on markets, commercial partners, or even subscribers to provide the content. For different reasons, information-based e-commerce companies all seek high sales volumes: they wish to take advantage of the economies of scale inherent in their cost structure, because average production costs fall as volumes grow. As with other businesses that have this cost profile, e-commerce businesses in this category often respond to their cost structure by charging a flat fee per user, such as a subscription fee, with the price structure often transitioning to some form of discounted fee for heavy volume accounts. Obviously, for those information-based e-commerce businesses for which advertising is the major source of revenue, scale is important to keep advertisers happy. Overall, even with rapidly declining marginal costs, if pricing does not cover the marginal costs and revenues in the long run do not recoup fixed costs, this model of e-commerce can prove financially disappointing. On the other hand, the cost structure for those e-commerce firms that use electronic means to distribute tangible goods, such as books, apparel, and toys, appears to mirror more conventional business models. The network costs for these firms reflect the well-understood high fixed cost and low marginal cost model of the electronic world. However, the economics of fulfillment--that is, providing and servicing the goods--still depend on these businesses' ability to achieve efficiencies and low unit costs for materials, storage, distribution, and after-sales service. The story of the demise of one prominent Internet-only retailer may be instructive in this regard. News reports indicate that the company had to build and maintain a web site costing about $40 million annually--the high fixed cost element of e-commerce--which it thought was required to achieve the desired revenue in the national market it hoped to serve. Besides this new economy cost, this retailer decided to build a proprietary distribution network, an old economy cost that, according to public sources, raised its investment in property and equipment to more than $100 million, a fourfold increase in one year. Analysts indicate that this equaled around 100 percent of the firm's 1999 revenues. As a benchmark, for land-based retailers a comparable number would be 20 percent of annual revenue, and for catalogue retailers, who often subcontract distribution, the comparable number would be 12 percent to 13 percent. Finally, because of low barriers to entry to the Internet market and the low cost to customers of switching from one seller to another, Internet-only firms appear to face high costs to obtain and retain customers. Again, published reports indicate that the retailer increased its budget by 30 percent in one year as new competitors moved into its market. Clearly, appropriately scaling the cost model to the market potential is another key lesson in the world of e-commerce. On the demand side, the so-called network effect is extremely important for some e-commerce businesses because the value of some services increases as the number of customers using them increases. 
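One common stylization of this effect, often attributed to Metcalfe, holds that a network's potential value grows with the number of possible pairings among its users. The sketch below illustrates that stylization with hypothetical user counts; it is an analytical convention, not a claim about any particular firm's economics.

```python
# Stylized network effect: potential value grows with the number of
# possible pairings among n users, n * (n - 1) / 2. Illustrative only.

def potential_pairings(n: int) -> int:
    """Number of distinct user pairs in a network of n participants."""
    return n * (n - 1) // 2

for n in (10, 100, 1_000):
    print(f"{n:>5} users -> {potential_pairings(n):>8,} possible trading pairs")
# 10 -> 45, 100 -> 4,950, 1,000 -> 499,500: value grows roughly with
# the square of the user base, so early scale compounds.
```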
The most obvious example is an on-line auction site: the more buyers and sellers using the site, the deeper the liquidity--that is, the greater the number of opportunities to trade and the greater the likelihood that trades will occur. For these firms, which include those that support the "auction" of equities as well as collectibles, high volume is critical to their success, and volume expectations appear to influence investments in these firms. I have already referred to economies of scale. Auctions and chat rooms provide an example of how network effects and scale economies can be mutually reinforcing, making high transaction volumes critical to both the supply and the demand sides of the market. Here let me also note that, for those for whom advertising is a key source of revenue, there is a virtuous cycle: a large customer base attracts more advertisers, which in turn finances more and better content and attracts even more users. The need for scale or volume appears to create an advantage for the first business to achieve critical mass in any market in which there are strong network effects. Understanding this dynamic and taking advantage of it appears to be another lesson from the early experience with e-commerce. The evolution of investment in e-commerce firms, particularly web-only firms, continues to receive attention in the press. Needless to say, rational economic behavior suggests that investors would require a high return to invest in unproven but potentially profitable endeavors. After all, some innovations struggle but succeed, while others arrive too soon for the technology, arrive too soon for the market, or are not commercially successful for a wide variety of other reasons. The rapid reassessment of the business prospects of some e-commerce firms during the course of last year is a reflection of this reality. Of course, the equity securities of these firms were revalued to reflect these changing assessments. Are stocks today overvalued, correctly valued, or undervalued? I certainly do not know, and I am not aware of anyone who does. As a result, I believe that it would be unwise--and indeed impossible--for the Federal Reserve to target specific levels of valuations in equity markets. However, valuation methods that are appropriately sensitive to the obvious business risks of e-commerce, as opposed to being driven by the assumption of the most optimistic outcomes for every concept, are key. In the long run, such approaches should provide a healthy base for maintaining a reasonable and sustainable pattern of growth and investment in the e-commerce segment. Costs of capital that reflect risks accurately are critical to a well-functioning economy. Therefore, with respect to observations and lessons, it appears that the basic rules of economics, commerce, and finance continue to apply. Though some macroeconomic conditions have changed importantly because of the technology investments that underpin e-commerce, the laws of economics have not been repealed. At the commercial level, any company considering a substantial investment needs to understand the business case and the underlying market cost, competition, and demand structures. Companies and their investors still need to assess the potential risks and returns based on that commercial reality. Payments and e-commerce Now I would like to turn to the topic of payment systems and discuss whether the existing arrangements support electronic commerce. 
In many ways the rapid growth of some elements of e-commerce is built on the solid base of preexisting payment systems and protocols. Even with the apparent ups and downs of specific electronic commerce providers, many purchases are being initiated through the Internet. The Census Bureau estimates that roughly $20 billion worth of retail transactions flowed over the Internet during the year ending September 2000, excluding large-dollar business-to-business transactions at least partly initiated through the Internet. Some private calculations reach twice that amount. These purchases are being paid for predominantly with traditional payment instruments that predate the World Wide Web. Given the growing importance and apparent potential of e-commerce, it is important that the older protocols of the payment system evolve to support this new element of our economy. To explore recent payment developments, it may help to distinguish among the markets for different types of payment transactions. Although the consumer-to-business (C2B), person-to-person, and business-to-business (B2B) categories likely break down when pushed too far, they can provide a convenient organizing framework for identifying payment transactions with some common characteristics. For some types of commerce, existing electronic payment instruments were easily adapted to the Internet. Most notably, small- to medium-sized C2B purchases are frequently made using credit cards. Because credit cards were already widely used for retail telephone transactions, these "card-not-present" transactions were easily accepted as part of commerce on the World Wide Web. Moreover, unlike many other payment instruments, credit cards could already support low-value international commerce, one of the historical barriers being challenged by the Internet. One card network estimates that 95 percent of retail purchases over the Internet in 1999 were made using debit, credit, and other payment cards. Some entrepreneurs are adapting other payment instruments for C2B electronic commerce. For example, debit card networks are exploring ways to enhance security and inter-network arrangements so that PIN-based debit transactions will be widely accepted as an alternative to credit cards for Internet-based sales. Similarly, vendors have developed pre-paid cards for which value can be purchased in advance and used to pay for on-line purchases by individuals who do not have access to credit cards or who prefer not to use them. In addition, some service providers have begun to offer "electronic check" or "e-check" products in which customers enter the information shown on the bottom line of a check and authorize the electronic debit of their checking accounts through a mechanism called the automated clearinghouse, or ACH. But existing and evolving payment instruments do not yet satisfy all of the needs of C2B e-commerce transactions. For instance, many firms provide bill payment services, and many are exploring ways to present bills electronically as well. Despite the growth of electronic bill payment applications, many of the bills for which payment instructions are initiated on-line are still paid by check. Similarly, there is not yet an easy way to pay for transactions such as on-line stock purchases, which have become popular. Instead, these purchases are generally charged against pre-funded brokerage accounts, although they could also be paid for by wire transfers through the purchaser's bank or by the prompt mailing of checks. 
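Returning briefly to the "e-check" products mentioned above: the sketch below shows one sanity check such a service might run on the routing number a customer types in from the bottom line of a check. The 3-7-1 weight pattern is the standard ABA routing-number checksum; the surrounding code and sample inputs are illustrative assumptions, not any particular provider's implementation.

```python
# Sketch of one validation an "e-check" service might perform on a
# customer-entered routing number. The 3-7-1 weighting is the standard
# ABA routing-number checksum; everything else is illustrative.

def valid_routing_number(rtn: str) -> bool:
    """True if the nine-digit ABA routing number passes its checksum."""
    if len(rtn) != 9 or not rtn.isdigit():
        return False
    weights = (3, 7, 1, 3, 7, 1, 3, 7, 1)
    total = sum(w * int(d) for w, d in zip(weights, rtn))
    return total % 10 == 0

print(valid_routing_number("011000015"))  # True: weighted sum is 20, divisible by 10
print(valid_routing_number("123456789"))  # False: weighted sum is 159, checksum fails
```

A checksum of this kind catches typing errors before a debit instruction is ever submitted to the ACH, which is precisely the sort of inexpensive safeguard that lets an old clearing mechanism carry new Internet traffic.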
Devising new ways to pay for securities trades is becoming increasingly important as the securities industry tries to reduce the time needed to settle trades from trade date plus three days (T+3) to trade date plus one day (T+1). For person-to-person commerce over the Internet, typically conducted through auction and similar web sites, there are very few electronic payment alternatives that can be easily transferred from physical commerce. Over the past few years, however, several service providers have created Internet-based person-to-person transfer services that build on credit card clearing but require the transfer of funds through an intermediary. Other service providers enable individuals to accept credit card payments, a function previously available only to businesses. Finally, a number of service providers are also trying to address the market for electronic B2B transactions. Thus far, some companies have adopted corporate purchasing cards, issued by traditional card-issuing companies, for their low- and medium-value on-line purchases. The demand for improved payment instruments for B2B transactions, however, may be even greater than for the C2B and person-to-person markets. Improving speed, reducing risk, and ensuring appropriate levels of privacy are important in all three markets. Also, to attract users, B2B payment mechanisms may need to provide additional features--for example, tools that reduce credit and timing risks in domestic and international markets. Other desirable features might enable data to flow seamlessly through the internal systems of the purchaser, the seller, and perhaps intermediaries--all at a low cost, of course. Because the products or services, scale, and complexity of business-to-business transactions vary widely, satisfying the needs of this diverse market may be more difficult than satisfying those of the other markets. I have heard reports, however, that many banks and other organizations are aggressively seeking ways to provide services for this market. The Federal Reserve in the payment system Now, turning to the role of the Federal Reserve, recall that one reason that the Congress established the Federal Reserve was to improve the nation's payment system; the Federal Reserve Act of 1913 provides the foundation for the Federal Reserve to establish a national check-clearing system. Today, the Reserve Banks distribute cash, clear checks, and provide electronic payment services to banks. In addition, the Federal Reserve System has had a long-standing role in helping to formulate public policies that improve the overall efficiency of the nation's payment system and reduce risks. Recently, evolving technology and growth in alternatives to cash and check payments have raised questions about the Federal Reserve's role in the payment system. In 1996 and 1997, a committee headed by Alice Rivlin, then the Federal Reserve Vice Chair, studied the Federal Reserve's operational role in the payment system. Ultimately, the study concluded that the Federal Reserve should continue to provide all its existing payment services with the explicit goal of enhancing efficiency, effectiveness, and convenience, while ensuring access for all banks. The study also recommended that the Federal Reserve work actively, closely, and collaboratively with providers and users of the payment system, both to enhance the efficiency of check and ACH services and to help develop strategies for moving to the next generation of payment instruments. 
To follow up on the recommendations of that study, the Federal Reserve created the Payments System Development Committee, which I co-chair with Cathy Minehan, President of the Federal Reserve Bank of Boston. The Payments System Development Committee has an explicit mission: in addition to identifying strategies for enhancing the long-term efficiency of the retail payment systems, it identifies barriers to innovation and works to address those barriers where possible. The committee is active in monitoring developments in payment markets and has sponsored workshops and forums that encourage focused discussions with the private sector. Our current activities include efforts to reduce legal and regulatory barriers to payment innovation, examine future clearing and settlement systems to support electronic commerce, assess our role in helping to set standards, and find ways to use new technologies to collect checks more efficiently. Conclusion The technological developments that enable us to engage in electronic commerce today have created tremendous opportunities to improve the ways in which we do business. Even as some businesses fail, they are contributing to our store of knowledge about what will and will not work in e-commerce. But beyond just the success or failure of specific businesses, electronic commerce has challenged the thinking of entrepreneurs and of those who lead traditional businesses. The developments in e-commerce have reminded us that change, even rapid change, is part of the normal evolution that we expect from market economies. They have also shown, however, that no matter what delivery mechanism is used, successful businesses must still follow good business practices, pay attention to basic economic principles, and sell products and services that buyers want. I believe this to be true for business generally and certainly for payments. A market-oriented approach to payment system innovation promises to provide long-lasting benefits to the consumers and businesses that use the U.S. payment system. We need to approach payment system innovations with an open mind and a willingness to learn. This is particularly true in the world of electronic commerce, where payments are being adapted to new technologies, products, and methods of doing business. These innovations are important in themselves. But they are also important because successful innovations to support electronic commerce may, over the long term, have a broad influence on the payment systems we use throughout our economy. I commend Vanderbilt University and its Owen Graduate School of Management for moving so forcefully to train the next generation of leaders in the e-commerce world. Not only are you serving your students; you are serving the global economy. Thank you for your attention.
Roger W. Ferguson, Jr: Understanding financial consolidation Remarks by Mr Roger W. Ferguson, Jr., Vice Chairman of the Federal Reserve Board, at a conference sponsored by the Securities Industry Association and the University of North Carolina School of Law, New York, New York, 27 February 2001 * * * Good afternoon. It is my pleasure to speak to you today, and I thank the Securities Industry Association and the University of North Carolina School of Law for inviting me to participate in this conference. Consolidation of all types of business activities has been a prominent feature of the economic landscape for at least the past decade. The financial sector has participated actively in this development. Indeed, the last few years have witnessed an acceleration of consolidation among financial institutions. Thus, your choice of topics for this afternoon's session is a timely one. In recognition of the importance of this marketplace evolution, and especially its potential effects on a wide range of public policies, the finance ministers and central bank governors of the Group of Ten nations in September 1999 commissioned a major study of the possible effects of financial consolidation on matters of policy concern to central banks and finance ministries in the G-10. This study, which I was privileged to direct, was released to the public late last month. Today I would like to discuss the study's major findings and their implications. Before reviewing these findings, however, let me make a few comments about current economic conditions. As always, the views I will be expressing are my own and are not necessarily shared by other members of the Board of Governors or of the Federal Open Market Committee. Current economic conditions Recent data on the U.S. economy confirm that a significant deceleration in activity has occurred. Our economy is clearly undergoing a stock adjustment to bring the supply of and demand for inventory and capital goods in some sectors into better alignment. One key question is: When will the stock adjustment run its course? Unfortunately, this question is not answerable with certainty at this stage. The predominant risk remains that growth will be notably slower than would be consistent with the economy realizing its full potential over time. One factor pointing to a risk of unacceptably slow growth is the uncertainty about consumer sentiment, which has been a feature of the recent period. It is therefore useful to explore more deeply economists' understanding of consumer confidence. Usually, consumer sentiment closely mirrors contemporaneous economic conditions, and it generally moves closely with the growth in household spending. Not surprisingly, the key influences on household spending, such as the growth of personal incomes and the buoyancy of financial markets, also appear to bear importantly on consumer sentiment. Relative to other indicators for the household sector, the indexes of consumer sentiment are available on a very timely basis and so can provide an early reading on the direction of household spending. Moreover, while it is a contemporaneous indicator of household spending, sentiment also seems to have some limited predictive ability for future household spending. These advantages notwithstanding, we are, of course, continually attempting to evaluate those numbers in the context of the entire range of other indicators for the household sector. 
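As an illustration of the kind of co-movement just described, the sketch below computes a contemporaneous correlation between changes in a sentiment index and growth in household spending. The series are synthetic stand-ins, not actual data; with real data one would use published sentiment indexes and consumption figures.

```python
# Illustration of checking how closely a sentiment index co-moves with
# spending growth. The series are synthetic, for demonstration only.

def correlation(xs, ys):
    """Pearson correlation of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

sentiment_change = [1.2, -0.5, 0.8, -2.1, 0.3, 1.0, -1.4, 0.6]   # synthetic
spending_growth  = [0.9, -0.2, 0.7, -1.5, 0.1, 0.8, -1.0, 0.5]   # synthetic

print(f"contemporaneous correlation: "
      f"{correlation(sentiment_change, spending_growth):.2f}")
```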
A somewhat puzzling feature of the recent period has been that, despite the sharp weakening in sentiment, household spending appears thus far to have held up well. How these apparently conflicting signals will be resolved going forward is not at all apparent from today's vantage point, and will bear close scrutiny. Another recent development has been some adverse movement in inflation rates. Recent inflation changes have been influenced importantly by increases in energy prices. Prices of petroleum-related products increased notably through the first half of last year, followed more recently by sharp increases in prices of natural gas and, in some regions, electricity. Largely reflecting these developments, consumer price inflation has accelerated appreciably over the past year, to 3-3/4 percent over the twelve months ended in January. The Consumer Price Index excluding food and energy has accelerated as well over the past year, but much more modestly than the overall index, to a rate of increase slightly above 2-1/2 percent. Energy prices likely contributed to the price increases of non-energy items through their effect on transportation costs and other inputs to production. Looking forward, participants in futures markets expect crude oil prices to continue to move down. Spot natural gas prices are well off their January highs and are expected to move still lower. If these expectations prove accurate, they should help contain inflation through both direct and indirect channels. Furthermore, any easing of resource utilization associated with relatively damped economic growth should also restrain price pressures going forward. In the current environment, monetary policy faces the short-term challenge of discerning the uncertain dimensions of the current slowdown in economic growth and responding appropriately. Any such response, however, will be made in the context of our continued commitment to an unchanged long-term mission--namely, the promotion of maximum sustainable employment through pursuit of price stability over time. The G-10 study of financial consolidation Let me now turn my attention to the topic of this afternoon's program, namely consolidation in the financial sector. The G-10 study had two primary objectives. It attempted to isolate the effects of consolidation from those of other powerful forces transforming our financial systems and to identify key areas in which financial consolidation requires new or accelerated policy development. The diversity of the economies involved--even among the G-10 nations, Australia, and Spain--and the interdependent nature of many of the forces affecting our financial systems made achieving these objectives difficult, to say the least. However, I believe the study was a success. Patterns and causes With a study of the depth, breadth, and, quite frankly, the length of this one, it is always potentially dangerous and even possibly misleading to summarize the key points in a few words. However, I believe that policymakers should communicate to a wide audience their thinking on important policy concerns, and thereby stimulate and contribute to dialogues in the public and private sectors. Thus, despite the risks, I would like to highlight what are, in my judgment, the study's key findings and policy implications. The report documents that, in the nations studied, a high level of merger and acquisition activity occurred during the 1990s among financial firms, defined to include depository institutions, securities firms, and insurance companies. 
During the decade, approximately 7,500 transactions, valued at roughly $1.6 trillion, were consummated. Moreover, the pace of consolidation increased over time, including a noticeable acceleration in the last three years of the decade. For example, the annual number of deals increased threefold during the 1990s, and the total value of deals increased almost tenfold. In Europe, roughly two-thirds of merger and acquisition activity, as measured by the value of the European firm acquired, occurred during the decade's last three years. Using a variety of measures, the United States accounted for about 55 percent of M&A activity, partly because of our historically large number of relatively small financial firms. However, it is also true that many very large U.S. banking institutions expanded their geographic footprint by acquiring other very large banks, especially later in the decade. Most of the last decade's merger and acquisition activity in the financial sector involved banking organizations. Acquisitions of banking firms accounted for 60 percent of all financial mergers and 70 percent of the value of those mergers in the nations studied. In addition, most M&A transactions involved firms competing in the same segment of the financial services industry within the same country, while domestic mergers involving firms in different segments of the overall financial services industry were the second most common type of transaction. Cross-border mergers and acquisitions were less frequent, especially those involving firms in different industry segments. Still, all types of mergers and acquisitions, whether within one country or cross-border and whether within one industry segment or across segments, increased in frequency and value during the 1990s. Joint ventures and strategic alliances provide an interesting contrast with some of the patterns in outright mergers and acquisitions. As with M&A activity, the number of joint ventures and strategic alliances increased during the 1990s, with especially large increases in the last two years. In the United States, which accounted for nearly half of all joint ventures and strategic alliances, the arrangements were overwhelmingly domestic. However, in the other twelve countries studied, cross-border joint ventures and strategic alliances overall exceeded domestic deals. Our research shows that financial consolidation substantially decreased the number of banking firms during the 1990s in almost every nation studied, and measures of the national concentration of the banking industry have tended to rise. Still, at the national level, the structure of the banking industry continues to differ greatly, ranging from very unconcentrated in a few nations--the United States and Germany--to highly concentrated in about half of the nations in our study. In contrast to banking, there are no consistent patterns across countries in changes in the number of insurance firms or concentration in the insurance industry during the 1990s. Within the securities industry, several specific activities, such as certain types of underwriting, are dominated by a small number of leading institutions. It is unclear, however, whether this pattern changed much over the 1990s. One of the most important conclusions of our study is that financial consolidation has helped to create a significant number of large, and in some cases increasingly complex, financial institutions. 
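Concentration comparisons of the kind made above are conventionally summarized with the Herfindahl-Hirschman Index, the sum of squared market shares. The sketch below uses hypothetical shares, not figures from the study.

```python
# Minimal sketch of the Herfindahl-Hirschman Index (HHI), a standard
# summary statistic for banking-market concentration. Hypothetical shares.

def hhi(shares_percent):
    """Sum of squared market shares (shares in percent), so the index
    runs from near 0 up to 10,000 for a pure monopoly."""
    return sum(s ** 2 for s in shares_percent)

unconcentrated = [5] * 20           # twenty banks with 5 percent each
concentrated = [40, 30, 20, 10]     # four banks dominate the market

print(hhi(unconcentrated))  # 500: a very unconcentrated market
print(hhi(concentrated))    # 3000: well above common screening thresholds
```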
In addition, these firms increasingly operate across national borders and are subject to a wide range of regulatory regimes. These observations have several important implications that I shall return to in a moment. Our work finds that the most important forces encouraging financial consolidation are improvements in information technology, financial deregulation, globalization of financial and nonfinancial markets, and increased shareholder pressure for financial performance. Because we expect these forces to continue, we expect financial consolidation to continue as well, even though the pace may be interrupted by swings in the macroeconomic cycle and other factors. The study considers a few possible future scenarios but concludes that the likelihood of specific future developments is impossible to assess with confidence. My own guess is that various patterns will emerge. Globally active universal financial service providers will continue to develop. We should also see the further development of firms specialized in the production of particular components of financial services or in the distribution to end-users of products obtained from specialized providers--providers that may exist within or outside the traditional financial services industry. I fully expect a large number of efficient and profitable small and medium-sized financial institutions to remain important players in the United States. I would guess this will also be the case in many other nations. In addition, the uncertainties of successful post-merger integration may well favor more use of looser forms of consolidation, such as joint ventures and strategic alliances. Monetary policy One of our more important policy concerns in designing the study was the potential effect of financial consolidation on the conduct and effectiveness of monetary policy. The study finds, however, that financial consolidation has not significantly affected the ability of central banks to achieve the objectives of monetary policy. Why is this? Although the answer is somewhat complex, let me try to explain briefly. As part of our research, we asked central banks in all the study nations about their experiences with consolidation and monetary policy. Virtually all reported that they had experienced at most minor effects, and those that had experienced somewhat stronger effects had been able to adjust with little difficulty. A key reason for this finding is that even with the substantial consolidation we have observed, the financial markets important for monetary policy have generally remained highly competitive. Even in those nations where consolidation has been considerable, competitive behavior has generally been sustained by the possibility that new firms could enter the markets at relatively low cost. It is also well worth noting that our work suggests that the development of the euro has been particularly helpful in maintaining competition in Europe. The euro has encouraged development of European money and capital markets, thus making the number of participants in a particular nation's markets less relevant. Consolidation could, at least in theory, affect the way changes in monetary policy are transmitted to the real economy. For example, consolidation could potentially alter the way banks adjust the availability and pricing of credit to their customers as the central bank changes the stance of monetary policy. However, central banks generally indicated that such effects had not been observed. 
Moreover, frequent reviews of the data should allow central banks to take account of any future changes when setting policy. On balance, and despite these quite positive results, our study recommends that central banks remain alert to the implications of any future reductions in the competitiveness of the markets most important for monetary policy implementation. Similarly, we suggest that central banks ought to monitor potential future effects on the transmission mechanisms for monetary policy. Monetary policy is simply too important to the health of all our economies to do otherwise. Financial risk Financial consolidation can affect the risks to both individual financial institutions and the financial system as a whole. Importantly, our study concludes that existing policies appear adequate to contain individual firm and systemic risks now and in the intermediate term. However, looking further ahead, the study identifies several topics that deserve careful attention by policymakers. For example, we conclude that the potential effects of financial consolidation on the risk of individual financial institutions are mixed and that the net result is impossible to generalize. Thus, we must evaluate individual firm risk on a case-by-case basis. Consolidation seems most likely to reduce risk through diversification gains, although even here the possibilities are complex. On the one hand, diversification gains seem likely from consolidation across regions of a given nation and across national borders. On the other hand, after consolidation some firms shift toward riskier asset portfolios, and consolidation may increase operating risks and managerial complexities for those firms. Diversification gains may also result from consolidation across financial products and services, although research suggests that the potential benefits may be fairly limited. In part because the net impact of consolidation on individual firm risk is unclear, the net impact of consolidation on systemic risk is also uncertain. However, as I noted, consolidation clearly has encouraged the creation of a number of large and increasingly complex financial institutions. Our study suggests that if such an institution became seriously distressed, consolidation and any attendant complexity might increase the chance that winding down the organization would be difficult or disorderly. We suggest that the risks to individual firms and to the financial system could be reduced by stepped-up efforts to understand the implications of working out a large and complex financial institution. Because no institution is too big to fail, I believe that regulators should develop a clearer understanding of, for example, the administration of bankruptcy laws and conventions across borders; the coordination of supervisory policies within and across borders; the treatment of over-the-counter derivatives, foreign exchange, and other "market" activities in distress situations; the roles and responsibilities of managers and boards of directors; and the administration of the lender-of-last-resort function. I say stepped-up discussions are needed in some of these areas because considering adverse developments is or should be a normal activity in all countries. Our study helped to clarify the need for international attention to this topic. 
Consolidation, and especially any resulting increased complexity of financial institutions, appears to have increased both the demand by market participants for and the supply by institutions of information regarding a firm's financial condition. The resulting rise in disclosures has probably improved firm transparency and encouraged market discipline and has thus lowered individual firm risk and perhaps increased financial stability. However, the increased complexity of firms has also made them more opaque, and their increased size has the potential to augment moral hazard. Thus, the net effect of consolidation on firm transparency and market discipline is unclear. Indeed, we conclude that there appears to be considerable room for improvement in disclosures by financial institutions. Our study suggests that both crisis prevention and crisis management could be improved by additional communication and cooperation among central banks, finance ministries, and other financial supervisors, domestically and internationally. Indeed, the study strongly supports existing efforts in these areas. In our view, the most important initiatives include proposals to improve the risk sensitivity of the international Basel Capital Accord and bank supervision, as well as efforts aimed at improving market discipline. A critical element of improved risk-based supervision is risk-based capital standards that are tied more closely to economic risk. Capital standards provide an anchor for virtually all other supervisory and regulatory actions and can support and improve both supervisory and market discipline. For example, early intervention policies triggered by more accurate capital standards could prove to be important in crisis prevention. Payment and settlement systems Financial consolidation is affecting the market structures for payment and securities settlement as well as banks' internal systems and procedures for payment and back-office activities. Our study concludes that, on balance, financial consolidation has led to a greater concentration of payment and settlement flows among fewer parties. Fortunately, our analysis indicates that the greater concentration of payment flows does not appear to have decreased competition in markets for payment and settlement services. However, we suggest that it would be advisable for government authorities to continue to monitor competition in the payment system. In contrast, our work indicates we should closely monitor the risk implications of consolidation in payment and settlement systems. On the one hand, consolidation may help to improve the effectiveness of institutions' credit and liquidity risk controls. For example, increased concentration of payment flows may allow institutions to get a more comprehensive picture of settlement exposures or create a greater ability to net internal payment flows. In addition, central banks have made major efforts over recent decades to contain and reduce systemic risk by operating and promoting real-time gross settlement systems and by insisting on the implementation of risk control measures in net settlement systems. On the other hand, consolidation may lead to a significant shift of risk from interbank settlement systems, where risk management may be more robust and transparent, to customer banks and third-party service providers, where risk management practices may be harder for users to discern. 
In addition, to the extent that consolidation results in a greater concentration of payment flows, the potential effects of an operational problem may increase. These and other developments imply that central bank oversight of the risks in interbank payment systems is becoming more closely linked with traditional supervision of individual institution safety and soundness. As a result, we conclude that increasing cooperation and communication between banking supervisors and payment system overseers may be necessary both domestically and internationally. Efficiency, competition, and credit flows Our study concludes with an extensive evaluation of the potential effects of financial consolidation on the efficiency of financial institutions, competition among such firms, and credit flows to households and small businesses. The study determines that although consolidation has some potential to improve operating efficiency, and has done so in some cases, the overall evidence in favor of efficiency gains is weak. Thus, we suggest that policymakers should carefully examine claims of substantial efficiency gains in proposed consolidations, especially in cases where a merger could raise significant issues of market power. Our work also attempts to shed some light on why academic researchers are less optimistic than business practitioners regarding the potential for consolidation to lead to efficiency gains. We suggest four possible reasons, which are not mutually exclusive. First, practitioners may consider cost reductions or revenue increases per se to be a success, without also taking into account independent industry trends as a benchmark. Second, managers may focus on absolute cost savings rather than on efficiency measures that compare costs to some other variable such as assets or revenues. Third, research finds little or no efficiency improvements on average, but this also means that some institutions may improve efficiency while some suffer from lower efficiency. Managers with inside knowledge of their firm may be justified in believing that their institution might be among those improving efficiency through a merger or acquisition. Lastly, past M&As may have suffered from regulations that reduced the benefits, and such regulations may not exist in the future. The effects of consolidation on competition and credit flows are case-specific and depend on the nature of markets for individual products and services. Some markets, such as those for wholesale financial services, generally show few problems. Others, such as those for retail products and services, sometimes experience problems from consolidation. Thus, as with other issues addressed by our study, a case-by-case evaluation of the relevant facts is required. Conclusion In conclusion, financial consolidation clearly is a powerful force that is deeply affecting the evolution of the financial system of the United States and many other nations. A thorough understanding of this force and its potential effects is critical for prudent decision-making in both the public and the private sectors. I believe the study that I have just summarized takes some major steps toward that understanding, and I hope that my remarks have helped you to comprehend our study's findings and implications. Still, all of us have much to learn, and much of what we know today will almost surely change in the future. I commend the organizers of this conference for seeking to advance our knowledge, and I again thank you for inviting me to contribute.